
Lazy days of autumn: CMS does emergency management (cue applause)!

I suppose you could accuse me of being a little lazy in this week’s offering, but I really want you to focus closely on what the CMS surveyors are instructed to ask for in the Emergency Management Interpretive Guidelines (more on those here; seems like forever ago), so I’ve done a bit of a regulatory reduction by pulling out the non-hospital elements (I still think they could have done a better job with sorting this out for the individual programs) and then pulling out the Survey Procedures piece—that’s really where the rubber meets the road in terms of how this is going to be surveyed, at least at the front end of the survey process.

I suspect (and we only have all of recorded history to fall back on for this) that as surveyors become more comfortable with the process, they may go a little off-topic from time to time (surprise, surprise, surprise!), but I think this is useful from a starting point. As I have maintained right along, I really believe that you folks have your arms around this, even to the point of shifting interpretations. This is the stuff that they’ve been instructed to ask for, so I think this is the stuff that you should verify is in place (and, really, I think you’ll find you’re in very good shape). There’s a fair amount of ground to cover, so I will leave you to it—until next week!

BTW, I purposely didn’t identify which of the specific pieces of the Final Rule apply to each set of Survey Procedures. If there is a hue and cry, I will be happy to do so (or you can make your own—it might be worth it to tie these across to the requirements), but I think these are the pieces to worry about, without the language of bureaucracy making a mess of things. Just sayin’…

Survey Procedures

  • Interview the facility leadership and ask him/her/them to describe the facility’s emergency preparedness program.
  • Ask to see the facility’s written policy and documentation on the emergency preparedness program.
  • For hospitals and critical access hospitals (CAH) only: Verify the hospital’s or CAH’s program was developed based on an all-hazards approach by asking their leadership to describe how the facility used an all-hazards approach when developing its program.

Survey Procedures

  • Verify the facility has an emergency preparedness plan by asking to see a copy of the plan.
  • Ask facility leadership to identify the hazards (e.g., natural, man-made, facility, geographic, etc.) that were identified in the facility’s risk assessment and how the risk assessment was conducted.
  • Review the plan to verify it contains all of the required elements.
  • Verify that the plan is reviewed and updated annually by looking for documentation of the date of the review and updates that were made to the plan based on the review.

 

Survey Procedures

  • Ask to see the written documentation of the facility’s risk assessments and associated strategies.
  • Interview the facility leadership and ask which hazards (e.g., natural, man-made, facility, geographic) were included in the facility’s risk assessment, why they were included and how the risk assessment was conducted.
  • Verify the risk assessment is based on an all-hazards approach specific to the geographic location of the facility and encompasses potential hazards.

Survey Procedures

Interview leadership and ask them to describe the following:

  • The facility’s patient populations that would be at risk during an emergency event
  • Strategies the facility (except for an ASC, hospice, PACE organization, HHA, CORF, CMHC, RHC, FQHC and end stage renal disease (ESRD) facility) has put in place to address the needs of at-risk or vulnerable patient populations
  • Services the facility would be able to provide during an emergency
  • How the facility plans to continue operations during an emergency
  • Delegations of authority and succession plans

Verify that all of the above are included in the written emergency plan.

Survey Procedures

Interview facility leadership and ask them to describe their process for ensuring cooperation and collaboration with local, tribal, regional, state, and federal emergency preparedness officials’ efforts to ensure an integrated response during a disaster or emergency situation.

  • Ask for documentation of the facility’s efforts to contact such officials and, when applicable, its participation in collaborative and cooperative planning efforts.
  • For ESRD facilities, ask to see documentation that the ESRD facility contacted the local public health and emergency management agency public official at least annually to confirm that the agency is aware of the ESRD facility’s needs in the event of an emergency and that the facility knows how to contact the agencies in the event of an emergency.

Survey Procedures

Review the written policies and procedures which address the facility’s emergency plan and verify the following:

  • Policies and procedures were developed based on the facility- and community-based risk assessment and communication plan, utilizing an all-hazards approach.
  • Ask to see documentation that verifies the policies and procedures have been reviewed and updated on an annual basis.

Survey Procedures

  • Verify the emergency plan includes policies and procedures for the provision of subsistence needs including, but not limited to, food, water and pharmaceutical supplies for patients and staff by reviewing the plan.
  • Verify the emergency plan includes policies and procedures to ensure adequate alternate energy sources necessary to maintain:

      o Temperatures to protect patient health and safety and for the safe and sanitary storage of provisions;

      o Emergency lighting; and

      o Fire detection, extinguishing, and alarm systems.

  • Verify the emergency plan includes policies and procedures to provide for sewage and waste disposal.

 

Survey Procedures

  • Ask staff to describe and/or demonstrate the tracking system used to document locations of patients and staff.
  • Verify that the tracking system is documented as part of the facility’s emergency plan policies and procedures.

 

Survey Procedures

  • Review the emergency plan to verify it includes policies and procedures for safe evacuation from the facility and that it includes all of the required elements.
  • When surveying an RHC or FQHC, verify that exit signs are placed in the appropriate locations to facilitate a safe evacuation.

 

Survey Procedures

  • Verify the emergency plan includes policies and procedures for how it will provide a means to shelter in place for patients, staff and volunteers who remain in a facility.
  • Review the policies and procedures for sheltering in place and evaluate whether they align with the facility’s emergency plan and risk assessment.

 

Survey Procedures

  • Ask to see a copy of the policies and procedures documenting the medical record documentation system the facility has developed to preserve patient (or, for OPOs, potential and actual donor) information, protect the confidentiality of that information, and secure and maintain the availability of records.

 

Survey Procedures

  • Verify the facility has included policies and procedures for the use of volunteers and other staffing strategies in its emergency plan.

 

Survey Procedures

  • Ask to see copies of the arrangements and/or any agreements the facility has with other facilities to receive patients in the event the facility is not able to care for them during an emergency.
  • Ask facility leadership to explain the arrangements in place for transportation in the event of an evacuation.

 

Survey Procedures

  • Verify the facility has included policies and procedures in its emergency plan describing the facility’s role in providing care and treatment (except for RNHCI, for care only) at alternate care sites under an 1135 waiver.

 

Survey Procedures

  • Verify that the facility has a written communication plan by asking to see the plan.
  • Ask to see evidence that the plan has been reviewed (and updated as necessary) on an annual basis.

 

Survey Procedures

  • Verify that all required contacts are included in the communication plan by asking to see a list of the contacts with their contact information.
  • Verify that all contact information has been reviewed and updated at least annually by asking to see evidence of the annual review.

 


Survey Procedures

  • Verify the communication plan includes primary and alternate means for communicating with facility staff, federal, state, tribal, regional and local emergency management agencies by reviewing the communication plan.
  • Ask to see the communications equipment or communication systems listed in the plan.

 

Survey Procedures

  • Verify the communication plan includes a method for sharing information and medical (or for RNHCIs only, care) documentation for patients under the facility’s care, as necessary, with other health (or care for RNHCIs) providers to maintain the continuity of care by reviewing the communication plan.

o For RNHCIs, verify that the method for sharing patient information is based on a requirement for the written election statement made by the patient or his or her legal representative.

  • Verify the facility has developed policies and procedures that address the means the facility will use to release patient information, to include the general condition and location of patients, by reviewing the communication plan.

 

Survey Procedures

  • Verify the communication plan includes a means of providing information about the facility’s needs, and its ability to provide assistance, to the authority having jurisdiction, the Incident Command Center, or designee by reviewing the communication plan.
  • For hospitals, CAHs, RNHCIs, inpatient hospices, PRTFs, LTC facilities, and ICF/IIDs, also verify if the communication plan includes a means of providing information about their occupancy.

 

Survey Procedures

  • Verify that the facility has a written training and testing (and for ESRD facilities, a patient orientation) program that meets the requirements of the regulation.
  • Verify the program has been reviewed and updated on, at least, an annual basis by asking for documentation of the annual review as well as any updates made.
  • Verify that ICF/IID emergency plans also meet the requirements for evacuation drills and training at §483.470(i).

 

Survey Procedures

  • Ask for copies of the facility’s initial emergency preparedness training and annual emergency preparedness training offerings.
  • Interview various staff and ask questions regarding the facility’s initial and annual training courses to verify staff knowledge of emergency procedures.
  • Review a sample of staff training files to verify staff have received initial and annual emergency preparedness training.

 

Survey Procedures

  • Ask to see documentation of the annual tabletop and full-scale exercises (which may include, but is not limited to, the exercise plan, the after-action report (AAR), and any additional documentation used by the facility to support the exercise).
  • Ask to see documentation of the facility’s efforts to identify a full-scale community-based exercise if it did not participate in one (i.e., the date, the personnel and agencies contacted, and the reasons for the inability to participate in a community-based exercise).
  • Request documentation of the facility’s analysis and response and how the facility updated its emergency program based on this analysis.

 

Survey Procedures

  • Verify that the hospital, CAH, or LTC facility has the required emergency and standby power systems to meet the requirements of the facility’s emergency plan and corresponding policies and procedures.
  • Review the emergency plan for “shelter in place” and evacuation plans. Based on those plans, does the facility have emergency power systems or plans in place to maintain safe operations while sheltering in place?
  • For hospitals, CAHs, and LTC facilities that are under construction or have existing buildings being renovated, verify the facility has a written plan to relocate the EPSS by the time construction is completed.

For hospitals, CAHs, and LTC facilities with generators:

  • For new construction that takes place between November 15, 2016, and November 15, 2017, verify the generator is located and installed in accordance with NFPA 110 and NFPA 99 when a new structure is built or when an existing structure or building is renovated. The applicability of both NFPA 110 and NFPA 99 addresses only new, altered, renovated, or modified generator locations.
  • Verify that hospitals, CAHs, and LTC facilities with an onsite fuel source maintain it in accordance with NFPA 110 for their generators, and have a plan for how to keep the generator operational during an emergency, unless they plan to evacuate.

 

Survey Procedures

  • Verify whether the facility has opted to be part of its healthcare system’s unified and integrated emergency preparedness program. If it has, verify its inclusion by asking to see documentation of its participation in the program.
  • Ask to see documentation that verifies the facility within the system was actively involved in the development of the unified emergency preparedness program.
  • Ask to see documentation that verifies the facility was actively involved in the annual reviews of the program requirements and any program updates.
  • Ask to see a copy of the entire integrated and unified emergency preparedness program and all required components (emergency plan, policies and procedures, communication plan, training and testing program).
  • Ask facility leadership to describe how the unified and integrated emergency preparedness program is updated based on changes within the healthcare system such as when facilities enter or leave the system.

 

To close out this week’s bloggy goodness, Diagnostic Imaging just published a piece on emergency preparedness for radiology departments that I think is worth checking out: http://www.diagnosticimaging.com/practice-management/emergency-preparedness-radiology. Imaging services are such a critical element of care giving (not to mention one of the largest financial investment areas of any healthcare organization) that a little extra attention to keeping things running when the world is falling (literally or figuratively) down around your ears is well worth it. I think we can make the case that integration of all hospital services is likely to be a key element of preparedness evaluation in the future—this is definitely worthy of your consideration.

Or the light that never warms

Continuing in our somewhat CMS-centric trajectory, I did want to touch upon one last topic (for the moment) as it portends some angst in the field. A couple of weeks ago (April 14, 2017, to be exact), the friendly folks at CMS issued notice of a proposed regulation change focusing on how Accrediting Organizations (AO) communicate survey results to the general public (you can find the details of the notice here).

At present, the various AOs do not make survey results and subsequent corrective action plans available to the general public, but apparently the intent is for that to change. So, using the Joint Commission data from 2016 as test data, it seems that a lot of folks are going to be highlighted in a manner that is not going to paint the prettiest picture. As we covered last week, hospitals and other healthcare organizations are not CMS’ customers, so CMS’ interest is pretty much solely in making sure that its customers are able to obtain information that may be helpful in making healthcare decisions. Returning to the Joint Commission data from last year, at least 50% of the hospitals surveyed will be “portrayed” as having issues in the environment (I’m standing by my prediction that those numbers are going to increase before they decrease—a prediction about which I will be more than happy to be incorrect). Now, the stated goal of this whole megillah is to improve the quality and safety of services provided to patients (can’t argue with that as a general concept), but I’m not entirely certain how memorializing a missed fire extinguisher check at an outpatient clinic or a missed weekly eyewash station check is going to help patients figure out where they want to obtain healthcare. So, I guess the question becomes one of how the folks we hire to assist with accreditation services (the folks for whom we are the customers) are going to share this information in the name of transparency? (Though I suppose if you were really diligent, it might be a little easier to discern trends in survey findings if you’re of a mind to dig through all the survey results.) It will be interesting to see how this plays out; I can’t imagine that they’d be able to publish survey results particularly quickly (I would think they would have to wait until the corrective action plan/evidence of standards compliance process worked itself through).

As with so many things related to the survey process, I understand what they are trying to do (begging the question: Is transparency always helpful?), but I’m not quite catching how this is going to help the process. I absolutely believe that the CMS and the AOs (could be a band name!) have a duty and an obligation to step in when patients are being placed at risk, as the result of care, environment, abuse, whatever. But does that extend to the “potential” of a process gap that “could” result in something bad happening—even in the presence of evidence that the risk is being appropriately managed? There always have been, and always will be, imperfections in any organization—and interpretations of what those imperfections may or may not represent. Does this process make us better or more fearful?

And to the surprise of absolutely no one…

Last week, the good folks at The Joint Commission announced the list of the five most challenging standards for hospitals surveyed during the first six months of 2016 (for those of you remaining reluctant to subscribe to the email updates, you can find the details for all accreditation programs here). For the purpose of this discussion, the focus will be on the hospital accreditation program—but if you want to talk detail specific to your organization and you are not a hospital, just drop a line.

While there has been some jockeying for position (the once insurmountable Integrity of Egress is starting to fade a wee bit—kind of like an aging heavyweight champion), I think we can place this little grouping squarely in the realm of the management of the physical environment:

 

  • EC.02.06.01—safe environment
  • IC.02.02.01—reducing the risk of infections associated with medical equipment, devices, and supplies
  • EC.02.05.01—utility systems risks
  • LS.02.01.20—integrity of egress
  • LS.02.01.35—provision and maintenance of fire extinguishing systems

I suspect that these will be a topic of conversation at the various and sundry TJC Executive Briefing sessions to be held over the next couple of weeks or so, though it is interesting to note that while Project REFRESH (the survey process’s new makeover) has (more or less) star billing (we covered this a little bit back in May), they are devoting the afternoon to the physical environment, both as a straight-ahead session helmed by George Mills and as a function of the management of infection control risks, with a crossover that includes Mr. Mills. I shan’t be a fly on the wall for these sessions (sometimes it’s better to keep one’s head down in the witless protection program), but I know some folks who know some folks, so I’m sure I’ll get at least a little bit of the skinny…

I don’t think we need to discuss the details of the top five; we’ve been rassling with them for a couple of years now and, PEP or no PEP (more on the Physical Environment Portal in a moment), I don’t believe that there’s much in the way of surprises lurking within these most challenging of quintuplets (if you have a pleasant or unpleasant surprise to share, please feel free to do so). And therein, I think, lies a bit of a conundrum/enigma/riddle. As near as I can tell, TJC and ASHE have devoted a fair amount of resources to populating the PEP with stuff. LS.02.01.35 has not had its day in the port-ular sunshine yet, but it’s next on the list for publication…perhaps even this month; not sure about IC.02.02.01, though I believe there is enough crossover into the physical environment world that it might even be the most valuable portal upon which they might chortle. And it does not appear to have had a substantial impact on how often these standards are being cited (I still long for the days of the list of the 20 most frequently cited standards—I suspect that that list is well-populated with EC/LS/IC/maybe EM findings). As I look at a lot of the content, I am not entirely certain that there’s a lot of information contained therein that was not very close to common knowledge—meaning, I don’t know that additional education is going to improve things. Folks know what they’re not supposed to do. And with the elimination of “C” performance elements and the Plans for Improvement process, how difficult is it going to be to find a single:

  • penetration
  • door that doesn’t latch
  • sprinkler head with dust or paint on it
  • fire extinguisher that is not quite mounted or inspected correctly
  • soiled utility room that is not demonstrably negative
  • day in which temperature or humidity was out of range
  • day of refrigerator temperature out of range with no documented action
  • missing crash cart check
  • infusion pump with an expired inspection sticker
  • lead apron in your offsite imaging center that dodged its annual fluoroscopy
  • missed eyewash station check
  • mis- or unlabeled spray bottle
  • open junction box

 

I think you understand what we’re looking at here.

At any rate, I look at this and I think about this (probably more than is of benefit, but what can one do…). Even if you have the most robust ownership and accountability at point of care/point of service, I don’t see how it is possible to have a reasonably thorough survey (and I do recognize that there is still some fair variability in the survey “experience”) and not get tapped for a lot of this stuff. This may be the new survey reality. And while I don’t disagree that the management of the physical environment is deserving of focus during the survey process, I think it’s going to generate a lot of angst in the world of the folks charged with managing the many imperfections endemic to spaces occupied by people. I guess we can hope that at some point, the performance elements can be rewritten to push towards a systematic management of the physical environment as a performance improvement approach. The framework is certainly there, but doesn’t necessarily tie across as a function of the survey process (at least not demonstrably so). I guess the best thing for us to do is to focus very closely on the types of deficiencies/imperfections noted above and start to manage them as data, but only to the extent that the data can teach us something we don’t know. I’ve run into a lot of organizations that are rounding, rounding, rounding and collecting scads of information about stuff that is broken, needs correction, etc., but they never seem to get ahead. Often, this is a function of DRIP (Data Rich, Information Poor). At this point, I firmly believe that if we do not focus on making improvements that are aimed at preventing/mitigating these conditions (again, check out that list above—I don’t think there’s anything that should come as a surprise), the process is doomed to failure.
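To make the "manage them as data" idea concrete, here's a minimal sketch of what that analysis might look like. Everything in it is invented for illustration (the log format, the category names, the helper name); the point is simply that a deficiency type showing up in round after round is the one your fix-it process isn't actually fixing:

```python
from collections import Counter

# Hypothetical rounding log: (round date, location, finding category).
# Categories mirror the list above (door latches, eyewash checks, etc.).
findings = [
    ("2017-01-09", "3 West", "door does not latch"),
    ("2017-01-09", "3 West", "missed eyewash station check"),
    ("2017-02-13", "3 West", "door does not latch"),
    ("2017-03-13", "3 West", "door does not latch"),
    ("2017-03-13", "ED", "open junction box"),
]

def repeat_offenders(findings, threshold=2):
    """Return finding categories cited in at least `threshold` distinct rounds.

    A category that reappears round after round signals a failed
    fix-and-forget cycle, i.e., the place to aim an actual improvement.
    """
    rounds_per_category = Counter()
    seen = set()
    for date, location, category in findings:
        key = (date, category)
        if key not in seen:  # count each category at most once per round
            seen.add(key)
            rounds_per_category[category] += 1
    return {c: n for c, n in rounds_per_category.items() if n >= threshold}

print(repeat_offenders(findings))  # → {'door does not latch': 3}
```

One-off items (the eyewash check, the junction box) drop out; the recurring latch problem floats to the top, which is exactly the "something new, something different" test described below.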

As I tell folks all the time, it is the easiest thing in the world to fix something (and we still need to keep the faith with that strategy), but it is the hardest thing in the world to keep it fixed. But that latter “thing” is exactly where the treasure is buried in this whole big mess. There is never going to be a time when we can round and not find anything—what we want to find is something new, something different. If we are rounding, rounding, rounding and finding the same thing time after time after time, then we are not improving anything. We’re just validating that we’re doing exactly the opposite. And that doesn’t seem like a very useful thing at all…

Devilish details and the whirling dervishes of compliance

In the absence of any new content on The Joint Commission’s Physical Environment Portal (the PEP ain’t none too peppy of late), I guess we’re going to have to return to our old standby for the latest and greatest coming out of Chicago: Perspectives! The August Perspectives has a fair amount of content pertinent to our little circle, so it probably makes too much sense to cover those key items and announcements.

The front page headline (as it should be) relates the ongoing tale of the dearly departing PFI process (which, I suppose, kind of makes this something of an obituary). Effective August 1, 2016, open PFI items will no longer be reviewed by the survey team nor will they be included in the Final Report generated by the survey. All Life Safety chapter deficiencies will become Requirements for Improvement (RFI) with a 60-day submittal window for your Evidence of Standards Compliance (and remember, one of the other TJC practices that departed this year was the “C” performance elements, so all of those pesky Opportunities for Improvement (OFI) at the end of your past survey reports will now become RFIs). Also, only equivalency requests related to survey events will be reviewed. More on that part of the big picture in a moment.

Also in the August Perspectives comes the official print announcement that the requirements of the 2012 Life Safety Code® will not be surveyed until November 1, 2016 (which should make for a very interesting few months in survey land for those of you moving towards the “closing” of your survey window), giving everyone on the regulatory compliance team a chance to complete the online education program and giving CMS time to update the survey forms and K-Tags. Apparently, the self-directed education program takes about 20 hours to complete (you can see the entire CMS memorandum here). The education program includes a pre- and post-test, and requires a passing score of 85%. I’m kind of curious about the format (I’m thinking perhaps the classic multiple choice format) and even more curious about whether they would ever make such a thing available to safety and facilities professionals. Presumably this means that whoever comes to your door on Tuesday, November 1 to survey your building will have passed the test. Would it be rude to ask them how they fared?

Next we turn to the “Clarifications and Expectations” column which, for all intents and purposes, is something of a recap of the PFI stuff, with the additional indication that TJC will no longer offer extensions and that the automatic six-month grace period is no longer available. Ostensibly, this means that those of you with open PFIs had probably better start cleaning things up. I’m still waiting to see something (anything?) on the subject of the inaccessible fire and smoke dampers; I think I’ve previously mentioned instances in which CMS has forced the issue of correcting the dampers, but I can’t help but think that that could be a very big pain in the posterior for some folks. I’d like to think that if these were simple to fix, they would already have been corrected (we wouldn’t take advantage of the process, would we?), so this could create a fairly burdensome situation for folks.

For those archivists among you, there is some interesting background on the 60-day time limit. Section §488.28(d) of the Code of Federal Regulations states: “Ordinarily a provider or supplier is expected to take the steps needed to achieve compliance within 60 days of being notified of the deficiencies, but the State survey agency may recommend that additional time be granted by the Secretary in individual situations, if in its judgment, it is not reasonable to expect compliance within 60 days, for example, a facility must obtain the approval of its governing body, or engage in competitive bidding.” Now that does provide a little sense of what will “fly” if one is forced to ask for a time-limited waiver (TLW—another acronym for the alphabet soup of compliance), but it’s tough to say whether any flexibility extends beyond those elements (who would ever have thought that competitive bidding might be helpful!).

Anyway, one thing relating to the SOC/PFI maelstrom (at least tangentially—and not mentioned in the August Perspectives) is the question of whether or not the presentation of the categorical waivers at the beginning of the survey process is still required. Certainly, the effective adoption date of the 2012 LSC (July 5, 2016) might potentially be the tipping point for informing the survey team of any categorical waivers your organization might have adopted, but I think the most appropriate cutoff date (if you will) for this practice would be on November 1, 2016 when CMS (and its minions) are charged with surveying to the requirements of the 2012 LSC. My overarching thought in this regard is that presenting the waivers to the survey team at the start of the survey certainly doesn’t hurt you and since the 2000 edition of the LSC is still the primary survey reference, it seems most appropriate to continue highlighting the waivers for the time being.

Back to Perspectives: One final EC-related item, for those of you with memory care units, there is specific coverage of the expectations under EC.02.06.01 relative to patient stimulation (or overstimulation), outdoor spaces for patients and residents with dementia, and other environmental elements. While these requirements apply to the Memory Care Certification chapter of the Nursing Care Center manual, again, if you happen to have a memory care unit within your span of control, you might find these expectations/performance elements useful in managing the environment. Even when not required, sometimes there are elements worth considering. After all, improving the patient experience as a function of the physical environment is one of our most important charges.

Blame it on Cain…

We’ll see how long this particular screed goes on when we get to the end…

In my mind (okay, what’s left of it), the “marketing” of safety and the management of the physical environment is an important component of your program. I have also learned over time that it is very rare indeed when one can “force” compliance onto an organization. Rather, I think you have to coax them into seeing things your way. At this point, I think we can all agree that compliance comes in many shapes, colors, sizes, etc., with the ideal “state” of compliance representing what is easiest (or most convenient) for staff to do. If we make compliance too difficult (both from a practical standpoint, as well as the conceptual), we tend to lose folks right out of the gate—and believe you me—we need everybody on board for the duration of the compliance ride.

For instance, I believe one of the cornerstone processes/undertakings on the compliance ride is the effectiveness of the reporting of imperfections in the physical environment (ideally, that report is generated in the same moment—or just after—the imperfection “occurs”). There are few things that frustrate me more than a wall that was absolutely pristine the day before, and is suddenly in possession of a 2- to 3-inch hole! There’s no evidence that something bored out of the wall (no debris on the floor under the hole), so the source of the hole must have been something external to the hole (imagine that!). So you go to check and see if some sort of notification had occurred and you find out, not so much. Somebody had to be there when it happened and who knows how many folks had walked by since its “creation,” but it’s almost like the hole is invisible to the naked eye or perhaps there’s some sort of temporal/spatial disruption going on—but I’m thinking probably not.

I’m reasonably certain that one can (and does) develop an eye/sense for some of the more esoteric elements of compliance (e.g., the surveyor who opens a cabinet drawer, reaches in, and pulls out the one expired item in the drawer), but do we need to educate folks to recognize holes in the wall as something that might need a wee bit of fixing? It would seem so…

At any rate, in trying to come up with some sort of catch phrase/mantra, etc., to promote safety, I came up with something that I wanted to share with the studio audience. I’d appreciate any feedback you’d be inclined to share:

WE MUST BE ABLE:

CAPABLE

RELIABLE

ACCOUNTABLE

SUSTAINABLE

I’m a great believer in the power of the silly/hokey concept when you’re trying to inspire folks; think of the most memorable TV ads: the funny ones tend to stick, concept and product alike (the truly weird ads are definitely memorable, but more often than not I couldn’t tell you what product was being advertised). I think that as a four-part vision, the above might be pretty workable. What do you think?

Reducing the length of stay: Not yours, but somebody who visits but once in a three-year cycle…

One of the most interesting parts of my job is helping folks through the actual Joint Commission survey process. Even as a somewhat distant observer, I can’t help but think that the average survey (in my experience) is about a day longer than it needs to be. Now, I recognize that some of that on-site time is dedicated to entering findings into the computer, so I get that. But there are certain parts of the process, like, oh I don’t know, the EC/EM interview session, that could be significantly reduced, if not dispensed with entirely. Seriously, once you’ve completed the survey of the actual environment, how much more information might you need to determine whether an organization has its act together?

At any rate, I suppose this rant is apropos of not very much, but the thought does occur to me from time to time. So I ask you: is there anybody out there who feels the length of the survey was just right or, heaven forbid, not long enough? As I’ve always maintained, TJC surveyors (or, for that matter, any regulatory survey types, consultants included) tend to look their best when you see them in the rear view mirror as you drive off into the future. I know the process is intended to be helpful on some level, but somehow the disruption never seems to result in a payoff worth the experience. But hey, that may just be me…

Any thoughts you’d like to share would be most appreciated.

A change will do you good…but what about no change? Exact change?

I’m sure you’ve all had a chance to look over the April 2014 issue of Perspectives, in which EC and LS findings combined to take seven of the top 10 most frequently cited standards during 2013, with issues relating to the integrity of egress taking the top spot.

At this point, I don’t think there are any surprises lurking within those most frequently occurring survey vulnerabilities (if someone out there in the audience has encountered a survey finding that was surprising, I would be most interested in hearing about it). The individual positions in the Top 10 may shift around a bit, but I think that it’s pretty clear that, at the very least, the focus of the TJC survey process has remained fairly constant these past couple of years.

Generally speaking, my sense about the TJC survey cycle is that specific focus items tend to occur in groups of three (based on the triennial survey cycle, with the assumption being that during each three-year period every hospital would be surveyed; and yes, I do know what happens when you assume…), and I think that 2013 may well represent the end of the first go-round of the intensive life safety survey process (I really believe that 2009–2010 were sort of beta-testing years). So the question I have for you good citizens of the safety world: has anyone been surveyed yet this year? With follow-up questions of:

  • Did you feel you were better prepared to manage the survey process this time?
  • Was the survey process different this time?
  • More of the same?
  • More difficult?
  • Less difficult?

I’m hoping to get a good sense of whether the tidal wave of EC/LS findings has indeed crested, so anyone interested in sharing would have my gratitude. Please feel free to respond to the group at large by leaving a comment here or if you prefer a little more stealthy approach, please e-mail me at smacarthur@greeley.com or stevemacsafetyspace@gmail.com.

Prioritize this…

During a recent survey, an interesting question was posed to the folks in Facilities, a question more than interesting enough to bring to your attention. The folks were asked to produce a policy that describes how they prioritize corrective maintenance work orders and they, in turn, asked me if I had such a thing. In my infinitely pithy response protocol, I indicated that I was not in the habit of collecting materials that are not required by regulatory standard. Now, I’m still not sure what the context of the question might have been (I will be visiting with these folks in the not too distant future and I plan on asking about the contextual applications of such a request), but it did give me cause to ponder the broader implications of the question.

I feel quite confident that developing a simple ranking scheme would be something that you could implement without having to go the whole policy route (I am personally no big fan of policies—they tend to be more complicated than they need to be and it’s frequently tougher to follow a policy 100% of the time, which is pretty much where the expectation bar is set during survey). I think something along the lines of:

Priority 1 – Immediate Threat to Health/Safety

Priority 2 – Direct Impact on Patient Care

Priority 3 – Indirect Impact on Patient Care

Priority 4 – No Impact on Patient Care

Priority 5 – Routine Repairs

would work pretty well under most, if perhaps not all, circumstances. The one circumstance I can “see” that might not quite lend itself to a specific hierarchy is when you have to run things on a “first come, first served” basis. Now, I recognize that since our workforces are incredibly nimble (unlike regulatory agencies and the like), we can re-prioritize things based on their impact on important processes, so the question I keep coming back to is: how can a policy ever truly reflect the complexities of such a process without somehow ending up in an “out of compliance with your policy” situation? This process works (or, I guess, in some instances doesn’t) because of the competence of the staff involved. I don’t see where a policy gets you that, but what do I know?
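For the tinkerers in the audience, the five-level scheme above can be sketched as a simple priority queue: urgent work jumps the line, and equal priorities fall back to first-come, first-served. Everything here (the class, the sample orders) is hypothetical and for illustration only, not a prescribed implementation.

```python
# A minimal sketch of the five-level work-order ranking scheme, assuming a
# hypothetical work-order feed; names and fields are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime
import heapq

PRIORITIES = {
    1: "Immediate threat to health/safety",
    2: "Direct impact on patient care",
    3: "Indirect impact on patient care",
    4: "No impact on patient care",
    5: "Routine repairs",
}

@dataclass(order=True)
class WorkOrder:
    priority: int          # 1 (most urgent) .. 5 (routine)
    submitted: datetime    # ties broken first-come, first-served
    description: str = field(compare=False)

def next_order(queue):
    """Pop the most urgent order; equal priorities go first-come, first-served."""
    return heapq.heappop(queue)

# Usage: three orders arrive; the Priority 1 item jumps the queue even
# though it arrived last.
queue = []
heapq.heappush(queue, WorkOrder(3, datetime(2014, 4, 1, 8, 0), "Flickering corridor light"))
heapq.heappush(queue, WorkOrder(5, datetime(2014, 4, 1, 8, 5), "Scuffed baseboard"))
heapq.heappush(queue, WorkOrder(1, datetime(2014, 4, 1, 8, 10), "OR door won't latch"))
print(next_order(queue).description)  # the Priority 1 item comes out first
```

The nice thing about a scheme this simple is that it captures the ranking without pretending to be a policy; the re-prioritization judgment stays with the competent staff doing the work.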

You may want to smoke during surveys

I could have sworn that I had covered this last year, but I can find no indication that I ever got past the title of this little piece of detritus, so I guess better late than never.

One of the more interestingly painful survey findings that I’ve come across hinges on the use of a household item that previously had caused little angst in survey circles: I speak of the mighty tissue paper! There have been any number of survey dings resulting from tissue paper either being blown or sucked in the wrong direction, based on whether a space is supposed to be positive or negative. And this lovely little finding has generated a fair amount of survey distress, as it usually (I can’t say always, but I don’t know of this coming up in a survey in which the following did not occur) drives a follow-up visit from CMS as a Condition-level finding under Physical Environment/Infection Control.
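However you test it (tissue, smoke, or a manometer), the underlying check is the same: does the measured pressure differential match the direction the room is supposed to have? A minimal sketch, with a made-up room list and readings; your ventilation drawings are the real source of truth for expected directions.

```python
# Expected direction relative to the corridor: +1 = positive, -1 = negative.
# This room list is hypothetical; use your own ventilation drawings.
EXPECTED = {
    "OR 1": +1,            # operating rooms are typically positive
    "AII room 214": -1,    # airborne infection isolation is negative
    "Soiled utility": -1,
}

def check_room(room, measured_pa):
    """Compare a measured differential (in pascals) against the expected direction."""
    expected = EXPECTED[room]
    # Direction must match, and a reading of exactly zero is never acceptable.
    return measured_pa != 0 and (measured_pa > 0) == (expected > 0)

# Usage: a reading that blows the wrong way gets flagged for follow-up.
readings = {"OR 1": 2.5, "AII room 214": 0.8, "Soiled utility": -1.2}
flagged = [room for room, pa in readings.items() if not check_room(room, pa)]
print(flagged)  # the isolation room reads positive where it should be negative
```

Logging checks like this over time also gives you something to hand a surveyor besides a fluttering piece of tissue.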

The primary “requirements” in this regard reside under A-Tag 0726 and can be found below. Now, I’m thinking that tissue paper might not be the most efficacious measure of pressure relationships, which (sort of; give me a little leeway here) raises the question of whether you should be prepared to “smoke” the doorway/window/etc. where tissue paper might not be sensitive enough to the subtleties of the pressures involved. I think it’s a reasonable thing to plan for, not least because there can be a whole lot at stake. So, I’ll ask you to review the materials below and be prepared to discuss…

A-0726

(Rev. 37, Issued: 10-17-08; Effective/Implementation Date: 10-17-08)

§482.41(c)(4) – There must be proper ventilation, light, and temperature controls in pharmaceutical, food preparation, and other appropriate areas.

Interpretive Guidelines §482.41(c)(4)

There must be proper ventilation in at least the following areas:

• Areas using ethylene oxide, nitrous oxide, glutaraldehydes, xylene, pentamidine, or other potentially hazardous substances;

• Locations where oxygen is transferred from one container to another;

• Isolation rooms and reverse isolation rooms (both must be in compliance with Federal and State laws, regulations, and guidelines such as OSHA, CDC, NIH, etc.);

• Pharmaceutical preparation areas (hoods, cabinets, etc.); and

• Laboratory locations.

 

There must be adequate lighting in all the patient care areas, and food and medication preparation areas.

Temperature, humidity and airflow in the operating rooms must be maintained within acceptable standards to inhibit bacterial growth and prevent infection, and promote patient comfort. Excessive humidity in the operating room is conducive to bacterial growth and compromises the integrity of wrapped sterile instruments and supplies. Each operating room should have separate temperature control. Acceptable standards such as from the Association of Operating Room Nurses (AORN) or the American Institute of Architects (AIA) should be incorporated into hospital policy.

The hospital must ensure that an appropriate number of refrigerators and/or heating devices are provided and ensure that food and pharmaceuticals are stored properly and in accordance with nationally accepted guidelines (food) and manufacturer’s recommendations (pharmaceuticals).

Survey Procedures §482.41(c)(4)

• Verify that all food and medication preparation areas are well lighted.

• Verify that the hospital is in compliance with ventilation requirements for patients with contagious airborne diseases, such as tuberculosis, patients receiving treatments with hazardous chemicals, surgical areas, and other areas where hazardous materials are stored.

• Verify that food products are stored under appropriate conditions (e.g., time, temperature, packaging, location) based on a nationally-accepted source such as the United States Department of Agriculture, the Food and Drug Administration, or other nationally-recognized standard.

• Verify that pharmaceuticals are stored at temperatures recommended by the product manufacturer.

• Verify that each operating room has temperature and humidity control mechanisms.

• Review temperature and humidity tracking log(s) to ensure that appropriate temperature and humidity levels are maintained.

 

Kind of vague, yes indeedy do! Purposefully vague; all in the eye of the beholder. Lots of verification and ensuring work, if you ask me, but this should give you a sense of some of the things on which you might consider focusing a little extra attention.
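On the “review temperature and humidity tracking log(s)” front, a quick way to make that review routine is to screen each log entry against your acceptable ranges. The ranges below are placeholders only; pull your actual limits from whatever standard your policy cites (AORN, ASHRAE 170, etc.), and the log entries are invented for illustration.

```python
# A minimal sketch of reviewing an OR temperature/humidity log against
# acceptable ranges. These ranges are illustrative placeholders; use the
# limits from the standard your own policy cites.
TEMP_RANGE_F = (68.0, 75.0)   # illustrative only
RH_RANGE_PCT = (20.0, 60.0)   # illustrative only

def out_of_range(entries, temp_range=TEMP_RANGE_F, rh_range=RH_RANGE_PCT):
    """Return log entries whose temperature or humidity falls outside the limits."""
    lo_t, hi_t = temp_range
    lo_rh, hi_rh = rh_range
    return [e for e in entries
            if not (lo_t <= e["temp_f"] <= hi_t and lo_rh <= e["rh_pct"] <= hi_rh)]

# Usage: the excessively humid entry should surface for follow-up,
# given the bacterial-growth concern called out in the guidelines.
log = [
    {"room": "OR 2", "temp_f": 70.1, "rh_pct": 45.0},
    {"room": "OR 2", "temp_f": 71.3, "rh_pct": 68.0},  # humid: flag it
]
print(out_of_range(log))
```

The point is less the code than the habit: if the log review only happens when a surveyor asks for it, the out-of-range entries have already done their damage.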

He ain’t HVA, he’s my opportunity

An interesting topic came across my desk relative to a January 2013 survey, and it pertains to the use of your HVA process as a means of driving staff education initiatives.

During the Emergency Management interview session during this particular survey, the surveyor wanted to know about the organization’s hazard vulnerability analysis (HVA) process and how it worked. So, that’s pretty normal—there are lots of ways to administer the HVA process—I prefer the consensus route, but that’s me.

But then the follow-up question was “How do you use the HVA to educate staff on the actions they should take?” Now, when I first looked at that, I was thinking that the HVA process is designed more as a means of prioritizing response activities, resource allocations, and communications to local, regional, and other emergency response agencies, but staff education? Not really sure about that…

But the more I considered it, the more I thought to myself: if you’re going to look at vulnerability as a true function of preparedness, then you have to include the education of staff on their roles and responsibilities during an emergency as a critical metric in evaluating that level of preparedness. The HVA not only should tell you where you are now, but also give you a sense of where you need to take things to make improvements, and from those improvements, presumably, there will be some element of staff education. A question I like to ask of folks is: “What is the emergency that you are most likely to experience for which you are least prepared?” Improvement does not usually reside in things you already do well or frequently. It’s generally the stuff that you don’t get to practice as often that can be problematic during real-life events. One example is the management of volunteer practitioners; this can be a fairly involved process, and if you haven’t practiced it during an exercise, there may be complexities that get in the way of an appropriate response during a real emergency. Which is why, if you haven’t run a couple of folks through the volunteer process, I recommend doing so during your next exercise. What better time?
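To make the “most likely, least prepared” question concrete: most consensus HVA tools (the Kaiser Permanente grid is the familiar one) score each hazard on probability and severity, then discount by preparedness, so a rarely practiced scenario floats up the list even if it isn’t the most probable. The hazards and scores below are made up purely for illustration.

```python
# A minimal sketch of a Kaiser-style HVA scoring grid, with a preparedness
# factor so education/practice gaps raise the relative risk. Hazards and
# scores are invented for illustration only.
def risk(probability, severity, preparedness):
    """All scores 0-3; returns relative risk as a percent of worst case (3*3*3)."""
    return probability * severity * (3 - preparedness) / 27 * 100

hazards = {
    # name: (probability, severity, preparedness), each scored 0-3
    "Winter storm": (3, 2, 3),                   # frequent, but well practiced
    "Volunteer practitioner influx": (1, 3, 0),  # rare, and never exercised
    "Utility failure": (2, 2, 2),
}

ranked = sorted(hazards, key=lambda h: risk(*hazards[h]), reverse=True)
print(ranked[0])  # the least-practiced hazard floats to the top
```

Run that way, the HVA answers the surveyor’s question almost by itself: the top of the ranked list is exactly where your next round of staff education belongs.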