
Dance on a volcano: Keeping tabs on those that keep tabs on us…

As we’ve discussed in the past, the world in which we exist—and the stories and challenges contained therein—is never ending. And the subtext of that constancy revolves around our efforts to stay (as it were) one step ahead of the sheriff.

Part of me is railing against my chosen topic this week because I always feel like this space can (and, admittedly, does) have a tendency towards a Joint Commission-centric vision of the compliance universe. But while they may not be the largest primate in that universe (once again violating all manner of metaphoric-mixing indignities), they are (more or less) the organization with the most robust customer-forward presence, from Perspectives to the FAQ pages to the topic-specific offerings we’re covering this week. All things being equal (which, of course, they never really are), I would encourage you to poke around a bit on these sites as there is a mix of stuff that is almost ancient, but also some tools, etc., that you might find useful in demonstrating compliance.

The Physical Environment portal is kind of the granddaddy of this whole construct; it started out as a collaboration with the American Society for Health Care Engineering (and may very well continue to be so, but it’s kind of tough to tell) with the goal of providing information on the most frequently cited standards. Unfortunately (for me, but not so much for you), a lot of the information, including “surveyor insights,” is accessible only through your organization’s TJC extranet portal, but there is some stuff that’s worth a look. For example, there is a fire drill matrix that gives a sense of what areas should be considered for your high-risk fire drills (or would it be fire drills in high-risk areas…); the one item on the matrix I found of some interest was the Cath/EP lab making the high-risk list. I guess the overarching thought is to make sure you carefully consider those areas in which surgical fires are present as a risk.

There are also portals for emergency management, healthcare-acquired infections (I would keep a close eye on that one; lots of indication that this is the next “big thing” for survey), and workplace violence. Keep an eye on them: You never know what might pop up!

Who remembers pop-o-matic Trouble?

In something of a variation on another bloggy evergreen, I ask the rhetorical question: To what, if any, extent have you included consideration of board games in your physical environment risk assessments for behavioral health? As I think towards a generation (are they already here?) for which the glories of board games will be forever lost, our friends in Chicago offer the latest challenge in managing risks with our all-too-vulnerable patient populations (for those of you of a certain vintage, the description of a board game is very nearly worth the price of admission).

The article describes the quite inventive use of a plastic board game piece to defeat the reptilian tamper-resistant screws and suggests some alternative products that do not so easily surrender to such efforts. I don’t know that I’ve been privy to a lot of discussion relative to board games in the behavioral health setting, but I suppose this would come under the heading of “everything has an inherent, though perhaps not apparent, risk.” Based on some recent surveys, it seems that Joint Commission surveyors have been rather inventive in looking for physical environment elements that have not been specifically accounted for in the assessment process. The classic example is including medical beds in the risk assessment, but not specifically mentioning the risks associated with the ligature-resistance (or not) of the side rails, bed frame, etc. Sooooooo, if they have not yet been included in your risk assessment activities, it might be a good time to pull a little group together and ponder the use of board games (and perhaps other such items) as a function of the behavioral health physical environment risk assessment.

Should we think about Halloween candy as well?!?

Check and mate!

Making a checklist, making it right: Reducing compliance errors

As you may have noticed, I am something of a fan of public radio (most of my listening in vehicles involves NPR and its analogues) and every once in a while, I hear something that I think would be useful to you folks out in the field. One show that I don’t hear too often (one of the things about terrestrial radio is that it’s all in the timing) is called “Hidden Brain”, the common subject thread being “A conversation about life’s unseen patterns.” I find the programs to be very thought-provoking, well-produced, and generally worth checking out.

This past weekend, they repeated a show from 2017 that described Dr. Atul Gawande’s (among others) use of checklists during surgical (and other) procedures to try to anticipate what unexpected things could occur based on the procedure, where they were operating, etc. One of the remarks that came up during the course of the program dealt with how extensive a checklist one might need, with the overarching thought being that a more limited checklist tends to work better because it’s more brain-friendly (I’m paraphrasing quite a bit here) than a checklist that goes on for pages and pages. I get a lot of questions/requests for tools/checklists for doing surveillance rounds, etc. (to be honest, it has been a very long time since I’ve actually “used” a physical checklist; my methodology, such as it is, tends to involve looking at the environment to see what “falls out”). Folks always seem a little disappointed when the checklist I cough up (so to speak) has about 15-20 items, particularly when I encourage them not to use all the items. When it comes to actual checklists that you’re going to use (particularly if you’re going to try to enlist the assistance of department-level folks) for survey prep, I think starting with five to seven items and working to hardwire those items into how folks “see” the environment is the best way to start. I recall a couple of years ago when first visiting a hospital—every day each manager was charged with completing a five-page environmental surveillance checklist—and I still was able to find imperfections in the environment (both items that they were actually checking on and a couple of other items that weren’t featured in the five-pager and later turned out to be somewhat important).
At the point of my arrival, this particular organization was (more or less) under siege from various regulatory forces and was really in a state of shock (sometimes a little regulatory trouble is like exsanguination in shark-infested waters) and had latched on to a process that, at the end of the day, was not particularly effective and had become almost a sleepwalk toward compliance (hey, that could be a new show about zombie safety officers, “The Walking Safe”).

At any rate, I think one of the defining tasks/charges of the safety professional is to facilitate the participation of point-of-care/point-of-service folks by helping them learn how to “see” the stuff that jumps out at us when we do our rounds. When you look at the stuff that tends to get cited during surveys (at least when it comes to the physical environment), there’s not a lot of crazy, dangerous stuff; it is the myriad imperfections that come from introducing people into the environment. Buildings are never more perfect than the moment before occupancy—after that, the struggle is real! And checklists might be a good way to get folks on the same page: just remember to start small and focus on the things that are most likely to cause trouble and are most “invisible” to folks.

You might have succeeded in changing: Using the annual evaluation to document progress!

I know some folks use the fiscal year (or as one boss a long time ago used to say, the physical year) for managing their annual evaluation process, but I think most lean towards the calendar year. At any rate, I want to urge you (and urge you most sincerely) to think about how you can use the annual evaluation process to demonstrate to leadership that you truly have an effective program: a program that goes beyond the plethora of little missteps of the interaction of humans and their environment. As we continue to paw through the data from various regulatory sources, it continues to be true more often than not that there will be findings in the physical environment during your organization’s next survey. In many ways, there is almost nothing you can do to hold the line at zero findings, so you need to help organizational leadership to understand the value of the process/program as a function of the management of a most imperfect environment.

I think I mentioned this not too long ago: I was probably cursing the notion of a dashboard that is so green that you can’t determine if folks are paying attention to real-life considerations or if they’re just good at cherry-picking measures/metrics that always look good. But as a safety scientist, I don’t want to know what’s going OK, I want to know about what’s not going OK and what steps are being taken to increase the OK-ness of the less than OK (ok?!?). There are no perfect buildings, just as there are no perfect organizations (exalted, maybe, but by no means perfect) and I don’t believe that I have ever encountered a safety officer who was not abundantly aware of the pitfalls/shortcomings/etc. within their organizations, but oh so often, there’s no evidence of that in the evaluation process (or, indeed, in committee minutes). It is the responsibility of organizational leadership to know what’s going on and to be able to allocate resources, etc., in the pursuit of excellence/perfection; if you don’t communicate effectively with leadership, then your program is potentially not as high-powered as it could be.

So, as the year draws to a close, I would encourage you to really start pushing down on your performance measures—look at your thresholds—have you set them at a point for which performance will always be within range? Use the process to drive improvement down to the “street” level of your organization—you’ve got to keep reaching out to the folks at point of care/point of service—in a lot of ways they have the most power to make your job easier (yeah, I know there’s something a little counterintuitive there, but I promise you it can work to your benefit).

At any rate, at the end of the process, you need to be able to speak about what you’ve improved and (perhaps most importantly) what needs to be improved. It’s always nice to be able to pat yourself on the back for good stuff, but you really need to be really clear on where you need to take things moving forward.

Education < / = / > Achievement: Don’t Let Survey Prep Get in the Way of Good Sense

I’d like to start off this week with an interesting (and hopefully instructive) tale from the field:

I was doing some work recently at an organization that is facing down the final six months of its survey window. This was my first visit to the facility and I was working on getting a sense of the place as well as identifying the usual list of survey vulnerabilities. As we’ve discussed before, one of the things that’s always in the mix, particularly with the gang from Chicago, is the care and feeding of emergency eyewash stations. This particular organization has adopted the strategy of having folks at the department level perform the weekly testing (a sensible approach from my standpoint—I think the most important piece of the weekly testing is helping to ensure that folks who might actually need the eyewash in an emergency actually know how the darn thing works), but the documentation form had two columns: one for the date and one for the signature of the person doing the test. The sheet did not, however, have any instructions on it, which prompted me to inquire as to how folks would know what (and why) they are checking, since the purpose is not just to run the water. The response to my inquiry was rather noncommittal, which is not that unusual, so I continued to collect data relative to the process. So, over the course of the facility tour, we found a couple of eyewashes with missing caps and no clear indication on the testing form that this had been identified as an issue. OK, not crazily unusual, but pointing towards a process that could use some tweaking. A couple of eyewashes with obstructed access provided a little more data.

Then we made our way to the kitchen. No real compliance issues with the eyewash itself, but I noted that they were checking the eyewash station on a daily basis and recording the temperature at that same frequency. Now, the ANSI standard does not require daily verification of eyewash flushing fluid temperature, so I asked about this particular practice (BTW: Nowhere else had we seen this practice—at least not yet …) and was informed that another hospital in the region had been cited for not doing the daily temp checks (I have not been able to verify that this was an actual survey finding, but sometimes believing is enough … to cause trouble). And then we headed over to the lab and ran into a similar practice (they were just verifying the temps during the weekly test) and the feedback there was that a College of American Pathologists (CAP) surveyor had told them a story about an individual that had suffered eye damage because the (low temperature) water from the eyewash interacted with a chemical. This was not written up as a finding, but was relayed as an anecdotal recommendation.

The “funny” thing about all this (actually, there are a couple of process gaps) is that each of the eyewash stations in question is equipped with mixing valves, which pretty much mitigates the need for daily or weekly temperature checks (you want to check the temp when you’re doing the annual preventive maintenance activity). But the more telling/unfortunate aspect of this is that (independent of each other) these folks had unilaterally adopted a process modification that was not in keeping with the rest of the organization (it has been said, and this is generally true, that you get more credit for being consistently wrong than inconsistently right). Now, one of the big truisms of the survey process is that it is almost impossible to push back when you are not compliant with your own policy/practice. And while I absolutely appreciate (particularly when the survey window is closing) wanting to “do the right thing,” it is of critical importance to discuss any changes (never mind changes in the late innings) with the folks responsible for the EOC program. While I pride myself on not telling folks that they have to do something that is not specifically required by code or regulation, some of the regulatory survey folks don’t share that reticence. The other potential dynamic for these “mythical” requirements is when a surveyor tells an organization something that doesn’t show up in the actual report. I run into this all the time—they may “look” at the finding in the report, but what they sometimes react to is what the surveyor “said.” Compliance has way more than 50 shades of whatever color you care to designate and what works/worked somewhere else doesn’t always work everywhere, so folks make these changes without knowing what is actually required and end up increasing the potential for a survey finding.

And healthcare isn’t the only pursuit in which incomplete communications (or making sure that communications are as complete as they can be) can have an impact. At the moment, I am reading An Astronaut’s Guide to Life on Earth by Col. Chris Hadfield (this, apparently, is going to be the summer for reading astronaut memoirs, be that as it may) and I came across a passage in which Hadfield describes a debriefing following a practice spacewalk in which one of the instructors noted that while Hadfield has a “very clear and authoritative manner,” he encouraged the folks participating in the debrief to not be “lulled into a feeling of complete confidence that he’s right.” As soon as I saw that, I was able to tie it back to the management of surveyors who speak in a “very clear and authoritative manner” and sometimes turn out not to be worthy of complete confidence that the surveyor is correct. If you are doing something that, in good faith and to the best of your knowledge, is the “right thing” and somebody (even me!) comes along and says you’re not doing that right, never be afraid to ask to see where it says that in the code/regulation, etc. (BTW: I’m not giving you permission to be obnoxious about it!) Surveyors (same for consultants) see a lot of stuff and sometimes compliance becomes a fixed idea, or process, in their head, but that doesn’t mean it’s the only way. And if you hear something that makes you think you have a vulnerability (something you’ve heard through that pesky grapevine), talk it out before you make any changes. That gives everyone in your organization a fighting chance at compliance.

As a final note, if you’ve forgotten about Col. Hadfield’s most notable performance (beyond the astronaut thing), check it out:

Documentary evidence: Sounds like you’re going to have to push a little more paper next survey!

A few weeks ago, our friends in Chicago upped the ante in releasing the updated documentation list for the Life Safety portion of the survey (you can find it—and I really, really, really suggest that you do so sooner rather than later—by logging into your Joint Commission portal and then clicking through the following internal links: > Survey Process, > Survey Activity Guide, > Additional Resources). And this is definitely a case of the list having shifted towards documentation of activities and conditions for which folks have been struggling to get in line. Now, from anecdotal discussions with folks, there’s not always a ton of time available for document review. So, in a lot of instances, the focus is on inspection, testing and maintenance of fire alarm and suppression systems equipment, emergency and standby power supply systems, medical gas and vacuum systems, with some “drift” into fire drills and other more or less standard areas of concern/coverage, including the management plans (sometimes—and those don’t appear to have earned a mention on the updated list).

However, according to that same updated document list, it looks like a lot of focus on inventory lists (operating components of utility systems; high-risk operating components on your inventory, infection control components); “embracing” (you can think of that as reviewing and adopting) manufacturer recommendations for inspection, testing and maintenance of utility systems or outlining the Alternative Equipment Maintenance program being used. And the same types of things for medical equipment—inventory, high risk equipment, consideration of manufacturer recommendations, etc. It also appears that there will be focus on sterilizer inspection, testing, and maintenance; compliance of your hyperbaric facilities (if you have them) with Chapter 14 of NFPA 99-2012; testing manual transfer switches in your emergency power supply system. Let’s see, what else…oh yes, for those of you with recently (I’m guessing that pesky July 6, 2016 date is the key point in time) constructed or renovated procedural areas, you need to make sure that you have (and are testing) task lighting in deep sedation and general anesthesia areas (the annual testing requirement is for a 30-minute test).

I’m sure there’s other stuff that will pop to the surface as we move through this next phase of the survey process; I’m curious about how much in-depth looking they’re going to be able to do and still be able to get to the lion’s share of your building (unless they start using unmanned drones…). I’m also curious that they don’t specifically indicate the risk assessment identified in Chapter 4 of NFPA 99-2012 (it has been asked for during CMS surveys), but that may be for the next iteration. Part of me can’t help but think back to those glory days when we wished for adoption of the 2012 Life Safety Code®; I guess we can take full advantage of the operational flexibilities inherent in suite configuration and a couple more things, but it never really seems to get any easier, does it?

At any rate, please hop on your organization’s TJC portal and give the updated list a look. If you see something that gives you hives, sing out: we’re all here to help!

Not enough rounding in the world: Compliance and readiness in the face of everyday chaos…

As I was engaged in my walk this morning (the sun just starting to cast its light on the Rockies!), I was pondering the complexities of the healthcare environment as a function of compliance. One of the truisms of my practice is that I am good at finding those points where things don’t quite gel. Sometimes (most times, to be honest), it’s relatively minor stuff (which we know is where most of the survey findings “live”) and every once in a while (mostly because my eyes are “fresh” and can pick out the stuff that’s happened over time; as I like to say, squalor happens incrementally), you find some bigger vulnerabilities (maybe it’s a gap in tracking code changes or a process that’s really not doing what you need it to do). So, after tooling around for a couple of days, folks will inevitably ask me “what do you look for?” and I will stumble through something like “I try to find things that are out of place” or something like that.

This morning, I had something of an epiphany in how that question actually informs what I do: it’s not so much what I look for, it’s what I look “at.” And that “at,” my friends, is everything in a space. One of the process elements that gets drilled into housekeeping folks (I’m pretty sure this is still the case, it definitely was back in 1978 when I started this journey) is to check your work before you go on to the next thing, and that means going back over everything you were supposed to do. I’ve had conversations with folks about what tools I’ve seen that have been effective (and I do believe in the usefulness of tools for keeping track of certain problematic or high-risk conditions), but only in very rare circumstances have I “relied” on a tool because I have an abject fear of missing something critical because I had a set of queries, if you will. I would submit to you that, from a compliance standpoint, there are few more complex environments in which to provide oversight than healthcare. It is anything but static (almost everything except for the walls can move—and does!) and in that constant motion is the kernel of complication that makes the job of facilities safety professional infinitely frustrating and infinitely rewarding.

So, I guess what I’m advising is not to limit your vision to looking “for,” but to strive to look “at” everything—and if you can impart that limitless vision to the folks who occupy your organization’s environment, you will have something quite powerful.


Who can turn the world on with her smile?

As we find 2017 reapplying time’s onslaught against pop culture icons, once again there’s a small “c” cornucopia of stuff to cover, some perhaps useful, some most assuredly not (that would be item #1, except for the advice part). Allons-y!

As goes the passage of time, so comes to us the latest and greatest edition of the Joint Commission’s Survey Activity Guide (2017 version). There does not appear to be a great deal of shifting in the survey sands beyond updating the Life Safety Code® (LSC) reference, reordering the first three performance elements for the Interim Life Safety Measure (ILSM) standard, and updating the time frame for sprinkler system impairments before you have to consider fire watches, etc. They also recommend having an IT representative for the “Emergency Management and Environment of Care and Emergency Management” (which makes EM the function so nice they named it twice…), which means that, yes indeedy, the emergency management/environment of care “interviews” remain on the docket (and review of the management plans and annual evaluations—oh, I wish those plans would go the way of the dodo…) for the building tour as well. Interestingly enough, there is no mention of the ILSM assessment discussion for any identified LSC deficiencies (perhaps that determination was made too late in the process)—or if there is, I can’t find it. So for those of you entertaining a survey this year, there’s not a ton of assistance contained therein. My best advice is to keep an eye on Perspectives—you know the surveyors will!

And speaking of which, the big news in the February 2017 issue of Perspectives is the impending introduction of the CMS K-tags to the Joint Commission standards family. For those of you who have not had the thrill of a CMS life safety survey, K-tags are used to identify specific elements of the LSC that are specifically required by CMS. Sometimes the K-tags line up with the Joint Commission standards and performance elements and sometimes they provide slightly different detail (but not to the point of being alternative facts). As TJC moves ever closer to the poisoned donut that is the Conditions of Participation, you will see more and more readily discernible cross-referencing between the EC/LS (and presumably EM) worlds. At any rate, if I can make one consultative recommendation from this whole pile of stuff, I would encourage you to start pulling apart Chapter 43 of the 2012 LSC – Building Rehabilitation, particularly those of you who have been engaged in the dark arts of renovation/upgrading of finishes, etc. You want to be very clear and very certain of where any current or just-completed projects fall on the continuum—new construction is nice as a concept (most new stuff is), but new construction also brings with it requirements to bring things up to date. This may all be much ado about little, but I’d just as soon not have to look back on 2017 as some catastrophic survey year, if you don’t mind…

Until next time, have a Fabulous February!

Don’t ask, don’t tell, don’t tell, don’t get in trouble…

Hope everyone is having a good week and that the rather stormy weather impacting so many parts of the country has not created too much of a challenge for you and your organizations.

This week is another (sort of) catch-all of topics, starting first with a little bit of CYA advice.

Lately there have been several instances (of which I am aware—can’t say for sure if this is an iceberg, but it “feels” like it might) of some very adverse accreditation/deemed status decisions based on insufficient documentation that organizational leadership had been effectively informed of conditions in the physical environment that required additional resources, etc. It’s not that organizational leadership was unaware of the conditions, but more that there was no trail of documented discussion (committee minutes, surveillance rounds, etc.) by which the organization could demonstrate to the surveyors that they had everything under control. In fact, the impression given because of the lack of a documented trail was exactly the opposite.

While nobody is really keen on telling their boss about problems of significance, especially problems for which the means of resolving them are elusive or beyond one’s resources (don’t want to look like you can’t do your job effectively), it is of critical importance to be able to escalate these types of issues to (or near) the top of the organization. Typically, this is about having to fund something (at least in my experience); maybe it’s a roof replacement; maybe it’s replacing some HVAC equipment—I’m sure most folks have a list of things for which it is a struggle to get traction. Let’s face it, unless it’s a new building, facilities infrastructure improvements, safety stuff, etc., is not particularly sexy, so when the capital improvement budgets come and go, it’s a tough sell. But sell it you must and you must keep pushing it—eventually those improvements (or lack thereof) are going to impact patient care and that’s when things can go south in a hurry. We always want to be respectful and not panicky, etc., but, please believe me, when the three- and four-letter regulatory folks knock on the door, you want to be in a position to describe how issues are brought to the attention of leadership. It may not be too pleasant in the moment (okay, in all likelihood, it won’t be pleasant at all), but it can save a whole lot of grief later on.

Next up (and this is something in the way of a commercial), The Joint Commission is hosting a webinar on Tuesday, February 7 to provide information on the new SAFER matrix, which is going to be an important feature of your survey report. We first covered it back in May, but now that they’ve been using it for the past few months (in behavioral health hospitals), it’s possible (I’m hoping likely, but I don’t want to get too amped up) that they will be sharing some useful information from the field. At any rate, particularly for those of you anticipating surveys in the next six to 12 months, I would try to make time for this one. I truly believe that every good intention is put into these survey changes, but I think we can all agree that those good intentions figure very prominently on a certain road…

Finally, this week, I would encourage you to look really, really, really closely at your interim life safety measures (ILSM) policy. TJC conducted a consultant conference last week and it is my understanding that the one significant shift in the survey of the physical environment is that there is going to be a lot of focus on the practical application of ILSMs as a function of Life Safety Code® deficiencies that cannot be immediately corrected. You have to make sure that your policy reflects an ongoing, robust process for that part of the equation. I think the conclusion has been drawn that folks generally have it together when it comes to ILSMs and construction, but are rather less skilled when it comes to those pesky LS deficiencies. We know they tend to focus on areas where they feel there are vulnerabilities (how else might one explain the proliferation of EC/LS/EM findings in recent years?). This is a big one folks, so don’t hesitate to dial in with questions.


The song changes and yet remains the same…

There was a time when The Joint Commission actually seemed to be encouraging folks to fully engage with the clarification process in all its bountiful goodness. And I certainly hope that folks have been using that process to ensure that they don’t (or didn’t) have to “fix” processes, etc., that might not have been absolutely perfect in execution, but were not, by any stretch of the imagination, broken. But now, it appears that the bounty is going to be somewhat less bountiful as TJC has announced changes to the process, effective January 1, 2017. Please forgive my conspiracy theorist take on this, but it does seem that the new order in the accreditation world lends itself to survey reports with an increasing number of findings, rather than a reduction—and I am shocked! Okay, perhaps “shocked” is a tad hyperbolic. BTW, in a new Advocacy Alert to members, it appears that ASHE has come to the same conclusion, so it’s not just me…hoorah!

And so, the changes:

 

  • Any required documents that are not available at the time of survey will no longer be eligible for the clarification process (basically, the vendor ate my homework). It is important for everyone to have a very clear understanding of what TJC means by “required documents”—there is a list on your organization’s Joint Commission extranet site. My advice, if you have not already done so, is to immediately coordinate the download of that list with your organization’s survey coordinator (or whoever holds the keys to accessing that information—it may even be you!) and start formulating a process for making sure that those documents are maintained in as current a fashion as possible. And make sure your vendors are very, very clear on how much time they have to provide you with the documentation, as well as letting you know ASAP whether you have any deficiencies/discrepancies to manage—that 60-day correction window can close awfully quickly!
  • While I never really liked to employ this strategy, there were times when you could use clerical errors in the survey document to have things removed from the survey report. Areas that were misidentified on the report (non-existent to your facility; not apropos to the cited finding, for example, identification of a rated door or wall where there is none, etc.) or perhaps the location of the finding was so vague as to be impossible to identify—these have all been used successfully, but (apparently) no more. Now whether this means that there will be more in-depth discussions with the survey team as they prepare the report is unknown at this time, but even if one slips by (and I can tell you, the survey reports in general are much more exact—and exacting—in their description of the deficiencies and their locations), it won’t be enough to remove it from the report (though it could make your ESC submittal a bit more challenging if you can’t tell what it is or where it is).
  • The other piece of this is, with the removal of “C” Elements of Performance, you can no longer go the audit route to demonstrate that you were in substantial compliance at the time of survey. So now, effectively, everything is being measured against “perfection” (son of a…); miss one month’s check on a single fire extinguisher and—boom—finding! One rated door that doesn’t latch? Boom—finding! One sprinkler head with dust or a missing escutcheon? Boom—finding! And, as we touched on last week, it’s not just your primary location (aka, “the hospital”) that’s in play—you have got to be able to account for all those pesky little care sites, even the ones for which you are not specifically providing services. Say, for example, the landlord at one of your off-sites is responsible for doing the fire extinguisher checks; if something is missed (and hey, what’s the likelihood of that happening…), then you are vulnerable for a finding. So, unless you are prepared to be absolutely, positively perfect, you’d best be making sure that your organization’s leadership understands that the new survey reality is not likely to be very pretty.

I would like nothing better than to tell you that with the leadership change in Washington there will be a loosening of the regulatory death grip that is today’s reality, but somehow I don’t think that’s gonna happen…