Author Archive for Donald A. Butler

Donald A. Butler

Donald A. Butler entered the nursing profession in 1993 and served 11 years with the US Navy Nurse Corps in a wide variety of settings and experiences. Since the CDI program’s implementation in 2006, he has served as the Clinical Documentation Improvement Manager at Vidant Medical Center (an 860-bed tertiary medical center serving the 29 counties of Eastern North Carolina). Always searching for better answers, or at least better questions, Butler says he has the privilege of supporting an outstanding team of CDI professionals, enjoys interacting with his CDI peers, and is blessed with a wonderful family.

Q&A: Measuring CDI program success

Ask a question by posting a reply below.


Editor’s Note: This Q&A has been expanded from the initial version, which was published during CDI Week. You can read Butler’s comments there and visit the archives to read what previous participants have said.

Q: What are the basic metrics that CDI programs should use to measure their success?
A:
To measure this process, I believe there need to be four fundamental metrics, each focused on what’s required to achieve the fundamental goal of CDI (accurate and complete documentation that is captured in the coding data):

  1. Volume
  2. Activity
  3. Results
  4. Compliance

Volume is the simplest to measure. Just look at the number of cases reviewed (there are a couple of broad benchmark sources; the most reasonable to me are in the range of 1,800 to 1,900 cases per CDI specialist per year, with appropriate adjustments for factors such as staffing, expertise, range of activity, focus of the program, etc.). Remember to define your target population: which cases does your CDI program focus on (e.g., Medicare, all DRG payers, all payers)? Then report what percentage of that target the CDI program is actually able to review, and set realistic goals based on your staffing and benchmarked expectations.
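To make the volume metric concrete, here is a minimal sketch in Python; the target size, review count, and staffing numbers below are purely hypothetical illustrations, not benchmarks from any particular source:

```python
# A minimal sketch of the volume metric: define the target population, then report
# what share of it the team actually reviewed and the annual volume per specialist.
# All numbers are hypothetical.

target_discharges = 14_000   # e.g., all DRG-payer discharges in the review year (assumed)
cases_reviewed = 11_200      # cases the CDI team actually reviewed (assumed)
cdi_specialists = 6          # full-time CDI specialists (assumed)

review_coverage = cases_reviewed / target_discharges
cases_per_specialist = cases_reviewed / cdi_specialists

print(f"Review coverage: {review_coverage:.1%}")                      # 80.0%
print(f"Cases per specialist per year: {cases_per_specialist:,.0f}")  # 1,867 (vs. ~1,800-1,900)
```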

The most common way to assess activity is to examine the query rate/percentage. There are two ways this is reported. I prefer the case query rate (how many cases had at least one query asked). Alternatively, you could divide the total number of queries by the number of cases reviewed.

I also find it helpful to report the total query rate along with query rates for specific areas of focus. The generic term I use is ‘impact’ query rate, where the impact, or outcome, of the query is defined by the individual program. For example, did the query potentially affect finances, mortality profiling, core measures, etc.?
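For illustration, here is a minimal sketch (hypothetical case records, not real data) showing both ways of reporting the query rate, plus a program-defined impact query rate:

```python
# A minimal sketch (hypothetical data) of the two query-rate calculations described
# above, plus a program-defined "impact" query rate.

cases = [
    # each case: number of queries asked, and whether any query met the program's
    # definition of impact (financial, mortality profiling, core measures, etc.)
    {"queries": 0, "impact": False},
    {"queries": 1, "impact": True},
    {"queries": 2, "impact": False},
    {"queries": 1, "impact": True},
]

cases_reviewed = len(cases)
total_queries = sum(c["queries"] for c in cases)
cases_with_query = sum(1 for c in cases if c["queries"] > 0)
cases_with_impact = sum(1 for c in cases if c["impact"])

case_query_rate = cases_with_query / cases_reviewed     # cases with at least one query
total_query_rate = total_queries / cases_reviewed       # queries per case reviewed
impact_query_rate = cases_with_impact / cases_reviewed  # program-defined impact

print(f"Case query rate:   {case_query_rate:.1%}")
print(f"Total query rate:  {total_query_rate:.2f} queries per case")
print(f"Impact query rate: {impact_query_rate:.1%}")
```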

[more]

Superstar stories: Tips for new CDI specialists to be successful

You know that old adage about what happens when the stars align!


There’s a lot to do, but you can do it!

Read, research, and constantly learn: Cultivate strong internet search skills along with multiple and varied sources. Build a reference library.

Network and share: Tap into a group of expert resources such as those available on the ACDIS message board CDI Talk. Actively reach out and participate. Reach out locally, regionally, and nationally within the CDI community but reach out to other professions within your own organization too. You will need the help of clinical experts, HIM groups, UR/CM, CDI, quality… Develop a strong partnership with coding.

Think outside the box … w-a-y outside, like over-the-horizon-outside-the-box: Develop creative solutions, messaging, questioning, and learning to strengthen this crazy role.

Remember your first priority: physician contact, discussion, and relations. Be ready to respond quickly to any questions or requests. Seek the chance to help with something of interest (a physician project, data, etc.). Be persistent and patient, especially when working with medical staff. Become a unique knowledge expert within the organization by working at the intersection of clinical care, documentation, coding, data, profiling, quality, regulations, etc.

ALWAYS keep sight of the long term goal. NEVER allow a short term event to override the long term goal. Apply your knowledge to look ahead. What is coming from external groups, how can you estimate the impact, and how can the organization respond pro-actively to the future change?

Remember it takes TIME and EFFORT to get up to speed, no matter how much experience and expertise you bring as an RN or HIM professional.

Most importantly, you CAN succeed!!

Editor’s Note: This post is part of our ongoing “CDI Superstar Stories” series which asks members to explain what it takes to be an exemplary CDI specialist. To share advice of your own, email ACDIS Member Services Specialist Penny Richards at prichards@cdiassociation.com and include “Superstar Stories” in the subject line.  ACDIS members can read additional stories in the January 2013 edition of the CDI Journal or type “Superstar” in the search box on this blog.

Tip: Use IPPS data for your CDI program benchmarking

How will you measure your CDI program? The IPPS Final Rule contains some interesting reporting data which may help.

During a recent CDI Talk conversation, I alluded to data available in the CMS IPPS Final Rule that CDI specialists can use to benchmark their progress and compare their efforts against national norms. It may take a little digging, development, and analysis but such effort is worth it.

One drawback, however, is its lack of comparability to hospitals similar to one’s own. Another drawback is that the data represent averages—and who wants to only be average? When you use this data to compare your individual organization’s performance against the national norms, keep in mind that an effective CDI program should likely be above those national benchmark averages. I say this for two reasons: First, many hospitals don’t have any CDI efforts in place and others have meager or ineffective programs so one can suspect the national reporting average to be lower than what an effective CDI program might observe as “average” at its facility.  Secondly, best-in-class is never average. I believe we all want to be effective if not “the best” in our CDI practices. Despite these two drawbacks, the data is free and available to everyone, so why not take advantage of it?

The following analysis is based on the data contained in Table 5, Table 7A, and Table 7B. Table 5 is, of course, the MS-DRG table for fiscal year (FY) 2013. The ICD-9-CM data used in Tables 7A and 7B is from FY 2011; that data is then pushed through the v29 grouper (FY12, Table 7A) and the v30 grouper (FY13, Table 7B). The primary purpose of Tables 7A and 7B is to present data on length of stay (LOS) at different percentiles, but they also provide the case volume across the entire Medicare data set for each MS-DRG. I used this table to forecast the possible impacts (assuming no changes in documentation) when acute renal failure was downgraded from an MCC to a complication/comorbidity (CC) in 2010. (Read that article, “CDI analysis can help facilities understand impact of MCC downgrade,” on the ACDIS Blog.)

Since we have access to the DRG volumes and the DRG relative weights (RW) from Table 5, we can start to examine frequency, distribution, etc. So, let’s start slicing the data. The national discharge volume equals 10,771,161 and the national case-mix index (CMI) equals 1.6045.
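As a sketch of the underlying arithmetic, the CMI is simply the discharge-weighted average of the relative weights. The DRG rows below are made-up illustrations rather than actual Table 5 values:

```python
# A minimal sketch of the case-mix index (CMI) calculation from Table 5-style data:
# CMI = sum(discharges * relative weight) / sum(discharges).
# The rows below are invented for illustration, not actual Table 5 figures.

drg_data = [
    # (MS-DRG, discharges, relative weight)
    ("003", 1_200, 17.60),
    ("190", 90_000, 1.18),
    ("470", 400_000, 2.10),
]

total_discharges = sum(vol for _, vol, _ in drg_data)
total_weight = sum(vol * rw for _, vol, rw in drg_data)
cmi = total_weight / total_discharges

print(f"Discharges: {total_discharges:,}  CMI: {cmi:.4f}")
```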

Let’s review some background before digging in deeper. Under the IPPS reimbursement system, ICD-9-CM diagnosis and procedure codes are grouped into Major Diagnostic Categories (MDCs). Most of these groupings are by body system; the exceptions are the “pre-MDCs,” HIV, and multiple significant trauma. Each MDC is further subdivided into Medicare severity diagnosis-related groups (MS-DRGs), which are classified as medical or surgical based on the presence of an ICD-9-CM procedure code that CMS classifies as requiring additional resources and that therefore impacts the DRG payment. The most basic MS-DRG organization sorts DRGs into three groups: medical DRGs, surgical DRGs, and an odd group called “Pre,” which largely consists of the most aggressive cases (transplants, heart machines, ECMO, etc.).

DRGs 3 and 4 are, of course, the rather heavily weighted DRGs for patients who received a trach but don’t have a primary head/neck diagnosis. These are likely the most widely occurring DRGs among the “Pre” group (they occur at many more, and smaller, hospitals than transplants and the like).

MS-DRG Type                % of total cases
Med                        72.2%
Surg                       27.8%
Surg without DRG 3 or 4    27.6%
Surg without Pre           26.7%

As an aside, note the very small difference in the percentage of total volume when eliminating DRGs 3 and 4: 0.2% of the total cases. Yet, in the table below, pulling out those few cases drives the CMI down for surgical cases by 0.0984, which is a 3.5% decrease.

Whenever I see an unexpected change (either up or down) in CMI, the first place I investigate the cause is in the general med/surg split and then the volumes of DRGs 3 and 4.  Then I look to see if there were any changes in service line volumes due to various possible factors, such as a short term change in physician staffing among certain higher weighted DRGs, a change in facility focus or operational capacity, as well as any significant market changes.

Why DRGs 3 and 4? For a hospital with a CMI of 2.0 and 1,000 discharges a month, just one fewer DRG 3 case a month will drive the CMI down roughly 0.016, which is a 0.8% decrease. One case is easily lost in the weeds if your focus is on case volumes.
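A quick back-of-the-envelope check of that example, assuming a DRG 3 relative weight of roughly 17.6 (an assumption for illustration; use the weight from the current Table 5):

```python
# A quick check of the DRG 3 example above. The relative weight (~17.6) is an
# assumption for illustration; substitute the value from the current Table 5.

baseline_cmi = 2.0
discharges = 1_000
drg3_weight = 17.6            # assumed relative weight for MS-DRG 003

total_weight = baseline_cmi * discharges                 # 2,000 total relative weight
new_cmi = (total_weight - drg3_weight) / (discharges - 1)
drop = baseline_cmi - new_cmi

print(f"New CMI: {new_cmi:.4f}  drop: {drop:.4f} ({drop / baseline_cmi:.1%})")
# -> a drop of roughly 0.016, i.e. about 0.8% of a 2.0 CMI
```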

MS-DRG Type                Number of Discharges    CMI
Med                        7,777,959               1.1338
Surg                       2,993,202               2.8275
Surg without DRG 3 or 4    2,973,568               2.7291
Surg without Pre           2,873,959               2.6329

With such a disparity in CMI between surgical and medical cases, and considering the relatively small slice of all patients that surgical DRGs represent, using the total CMI as a metric for CDI effectiveness might be considered fraught with risk.

At first glance, here is the breakdown by DRG type, according to how secondary diagnoses influence the DRG assignment:

                  All    Med    Surg
None              9%     11%    3%
Pair MCC          33%    34%    32%
Pair CC or MCC    3%     1%     8%
Triplet           55%    54%    57%

It is interesting to see where the volume variations between medicine and surgery fall, specifically in the “None” row (DRGs such as chest pain, TIA, and syncope) and the “Pair CC or MCC” row. The volumes between the MCC pair and the triplet are similar for both medical and surgical DRGs. However, as one can see below, the overall capture of secondary diagnoses is rather different between the medical and surgical DRGs. [more]
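For anyone who wants to reproduce this kind of breakdown, here is a minimal sketch of one way to group MS-DRGs into singles, pairs, and triplets by collapsing titles that differ only in their CC/MCC suffix. The titles below are illustrative; actual Table 5 titles abbreviate the suffixes (e.g., “W MCC,” “W/O CC/MCC”), so the pattern would need adjusting:

```python
# A minimal sketch of classifying MS-DRGs into "none / pair / triplet" groups by
# collapsing titles that differ only in their CC/MCC suffix. Titles are illustrative.

from collections import defaultdict
import re

SUFFIX = re.compile(r"\s+(WITH MCC|WITH CC/MCC|WITH CC|WITHOUT CC/MCC|WITHOUT MCC)$")

drg_titles = {
    "064": "INTRACRANIAL HEMORRHAGE OR CEREBRAL INFARCTION WITH MCC",
    "065": "INTRACRANIAL HEMORRHAGE OR CEREBRAL INFARCTION WITH CC",
    "066": "INTRACRANIAL HEMORRHAGE OR CEREBRAL INFARCTION WITHOUT CC/MCC",
    "313": "CHEST PAIN",
}

base_groups = defaultdict(list)
for drg, title in drg_titles.items():
    base_groups[SUFFIX.sub("", title)].append(drg)

for base, drgs in base_groups.items():
    kind = {1: "none (single DRG)", 2: "pair", 3: "triplet"}.get(len(drgs), "other")
    print(f"{kind:18s} {base} -> {', '.join(drgs)}")
```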

Crossing CDI program boundaries

What will it take to push your program beyond its artificial boundaries?

What new boundaries are CDI professionals exploring? CDI specialists discussed several areas of expansion during the 2011 CDI Week celebrations last September. You can read about them in the special CDI Week Q&As and in the CDI Week Industry Survey, which are still available on the ACDIS website. CDI professionals also frequently explore the boundaries of the CDI profession on the ACDIS Blog and on CDI Talk discussion strings.

And I know that those fortunate enough to attend the ACDIS conference in San Diego next week will certainly learn about new documentation improvement opportunities. Come to think of it, the conference offers such good ideas every year (and a good idea doesn’t truly get stale) that you should take a look back at conference materials from previous events to see what tips you may find and consider implementing.

Conversations regarding CDI expansion really should be considered aspects of program and organizational strategic planning. CDI managers need to consider where CDI specialists will focus their primary efforts over the next year, two years, even five years.

Yes, the regulatory environment governing healthcare is always changing and most CDI program directors can guess about how those regulatory changes will affect CDI, patient care, and the healthcare revenue cycle. But well-informed professionals can make some practical suggestions to position their CDI team appropriately for the future.

Warning: what follows is somewhat like throwing pasta against a wall—some ideas may simply fall, and other ideas, like a good al dente macaroni, will stick. Regardless, here are my thoughts about possible avenues for CDI program expansion.

CDI specialists should consider conducting record reviews for:

  • Mortality/quality/length of stay/severity of illness profiling
  • Surgical complications
  • Hospital acquired and present on admission conditions
  • Medical necessity support (both initial and ongoing stay)
  • Evaluation and management documentation

Additionally, CDI programs may gain ground by exploring:

  • Medicaid, third-party, private payer initiatives
  • Outpatient CDI (e.g., emergency department, ambulatory, denials management)
  • Documentation improvement opportunities in alternative settings such as long-term care, rehabilitation, psych, pediatric, and obstetrics units (ACDIS recently launched a new networking group dubbed APDIS, the Association for Pediatric Documentation Improvement Specialists)
  • New government initiatives such as Value-Based Purchasing, Accountable Care Organizations, and payment bundling
  • Proactive Recovery Auditor and external auditor defense
  • Collaboration in development of clinical best practice, documentation, protocols, etc.
  • Data mining and reporting (internal drivers and external reports)
  • ‘Hardwiring’ documentation improvement elements into EMR and IT systems
  • Quality data versus coded data
    • Why and where does a difference exist?
    • What can be done to ensure both data sets are parallel and completely accurate?
    • How can CDI contribute to clinical care and quality data measurements?

Of course, a number of previous posts directly or indirectly address exploring new CDI areas. As you investigate new ideas, try new things out, consider sharing with your professional colleagues—comment on CDI Talk, write a blog post, contribute a CDI Strategies quick note, or partner with other staff to write a CDI Journal article.

CDI specialist orientation (more CDI Talk inspiration)

One of the repeated conversation themes on CDI Talk is how to orient a new staff member (within an existing program), or how a small program can start its own CDI efforts and train its own staff. Parallel to those conversation threads is participants’ real hunger for more avenues and sources of education.

Let’s look at some of ACDIS’ online poll data to set the stage:

  • July 2011: How many total years of professional experience do you have in healthcare (CDI, plus other)?
    • 20 years or more, 60%
  • November 2009: How long did it take you to get up to speed as a new CDI specialist?
    • 3 to 6 months, 32%
    • 6 to 12 months, 34%
  • June 2011: How long do you think it takes to achieve an “expert” level of proficiency as a CDI specialist?
    • 2 years, 35%
    • 3 years, 22%.

And here’s one final online poll data point to help me answer the question as to whether CDI managers are actually providing enough training to new staff members:

  • January 2011: How long is your training period for new CDI specialists?
    • 2 weeks, 12%
    • 30 days, 22%
    • 31 to 60 days, 30%
    • 61 to 120 days, 20%
    • Approximately 6 months, 12%
    • Less than 6 months, 3%

It seems to me that those who indicated it takes six months or more to get up to speed need more training than what I commonly consider necessary as part of orientation. This data suggests that what these new CDI specialists need is more of a mini-college training program.

Obviously there is a rather significant challenge—how to provide the level of knowledge and training, along with the appropriate mentoring, to actively promote and support the new CDI specialists so they can succeed. Of course, there is always the consultant option, which proves to be relatively expensive. Plus, a ‘mature’ program should not need to rely on such an expensive option for new staff orientations. At the opposite end of the spectrum is the ‘sink or swim’ method.

(Image via Homeclick: a sink made so fish swim in it. Get it? Sink or swim.)

Thankfully, home grown and self-supported possibilities exist to constitute a middle ground between these two options. At the very least, facilities should implement an orientation or mentoring process where the experienced individual’s guidance can make a huge impact.

I believe the biggest challenge facing those hoping to implement a CDI orientation program comes from a lack of targeted, written learning resources. I consider one of the largest draws of ACDIS membership to be the need for learning, resources, and access to a community of knowledgeable and supportive peers. ACDIS provides such a community, with a quickly growing resource base. (If you’re a member, you ought to know. If not, go look at every part of the ACDIS home page.)

In addition, ACDIS offers a few helpful handbooks and guides that can be re-purposed for orientation, such as:

Furthermore, the only independent (i.e., not part of a consulting package) seminar I’ve found is HCPro’s CDI Boot Camp. While the total cost (fee, travel, hotel) may be prohibitive for many, there is also the online version as an option. Again, a mature CDI program ought to be able to handle at least some of the orientation process internally.

Even with the valuable resources of ACDIS, some holes in new staff orientation remain. AHIMA and AHA’s Coding Clinic for ICD-9-CM provide further guidance, but even those resources do not cover everything. Several major elements of an orientation program are not addressed by the resources mentioned.  Just to get started, how about:

  • Creating a tool that outlines in detail basic competency and knowledge expectations for the novice CDI specialist. This tool should also list areas for mid-level and advanced achievement to give new CDI staff a set of expectations for continued professional growth. There are some examples in the Forms & Tools Library, in the policies and procedures section (search for “staff orientation checklist”), but not at the level of detail I envision.
  • Curating a collection of vital subject articles and references. (Review the CDI Journal archives, the ACDIS Blog, and the Helpful Resources links just to get started on this collection. Add in other professional organizations and their publications such as the National Institutes of Health, AHIMA, AMA, and others and this would be a one-stop database of useful CDI knowledge.)
  • Creating an outline of topics that the new CDI specialist needs to master before achieving their initial competency. Further, this outline ought to provide enough detail and referenced sources to serve as a complete training program guide.
    • Sources would likely include the books and articles mentioned immediately above, along with sections of widely accepted texts such as coding guidelines, Faye Brown, and medicine texts like the Merck Manual.

Before starting to collect all those articles and tools, though, I should probably determine the basic elements of an orientation program! Below I’ve listed a few resources online which discuss this, including:

After reviewing these, I must confess that my definition of orientation varies from those discussed above.  Still, several points are important to keep in mind to successfully bring a new staff member up to speed in the CDI world:

  • Provide structured, purposeful training
  • Offer a straightforward sequence of topics or activities to enable learning
  • Give new staff members a written agenda complete with goals and measurable objectives
  • Provide ongoing, two-way feedback and evaluation
  • Supply appropriate resources and support
  • Actively integrate the new person into the team
  • Celebrate and welcome the individual and his/her accomplishments as they gain proficiency in their new role
  • Pair new staff with an experienced mentor and provide oversight of their engagement
  • Offer engaging, interactive, as well as some self-directed education

However, as mature and professional learners, CDI specialists must be responsible and accountable for their education and success.

Honestly, for a new or developing program that has to add or replace staff, the right consultant is worth the money.

At some point, CDI programs need to be able to hire new staff and train them in-house. Creating a comprehensive training program does require a lot of effort, and maybe it is work that some of you have already done. If so, why duplicate it? Let’s see if we can compile a “best of” list of the program components others have found successful and create a tool that we can share. Post your information here on the blog, e-mail me, or contact Associate Director Melissa Varnavas at mvarnavas@cdiassociation.com.

Reflections on physician leadership and engagement with CDI programs

Over the past several years there have been a number of conversations that touch on physician leadership involvement with CDI. Programs can and do achieve success, but so much more is achieved when there is a proactive and supportive medical voice.

Physician leadership can come from a number of sources and in a variety of forms. Some CDI programs (a few anyway) report directly or indirectly to a physician executive (medical staff functions, chief medical officer [CMO], etc.) and other programs report to the quality department where a physician executive is frequently directly involved. In these circumstances, I hope the physician executive maintains some amount of time dedicated for CDI efforts.

Some organizations are fortunate enough to have physician leadership within the broader organization that is (or has been convinced to be) very supportive of CDI efforts. From what I’ve heard, these frequently include CMOs and chiefs of staff and/or service lines within a given facility. Finally, some physicians, such as a medical director, physician champion, advisor, or liaison, devote a portion of their time to work directly with CDI. (Read more about the expanding roles and responsibilities of CDI physician advisors in the January 2012 edition of the CDI Journal.)

Furthermore, even with supportive medical staff leadership, how that support translates into action varies. Some facilities provide physicians time to offer educational sessions to their CDI and coding teams. Others provide CDI education sessions to entire physician groups by service line.

Most CDI programs earn physician leadership and support through the tireless efforts of the CDI staff and program leaders. Only occasionally have I seen this support present from the very beginning.

Some Perspectives

I’d like to look at the “state of affairs” in regards to physician leadership.  One ACDIS weekly online poll (2008) addressed the simple question of whether respondents had a “physician champion” and if that champion was effective. That poll was rather surprising; only 46% indicated they had a physician champion, and half of the respondents with a physician champion actually rated him/her as ineffective. So, according to that poll, only 23% of programs have an effective physician advisor.

ACDIS repeated the poll (with slightly different wording) in April 2011, and though the results showed some improvement, they were still discouraging. In 2011, 31% described having a very beneficial physician champion, 22% described their physician champion as “minimally effective,” 24% felt the position was not affordable, and 16% indicated that their program could not find a good candidate. Even more surprising to me, 7% said they simply did not see the need for the role.

Additional polls from 2008 which echo the theme of limited physician support for CDI programs include:

Other recent poll responses illustrate different aspects of physician involvement in CDI, but I thought these painted an interesting picture.

Don’t forget the most recent study, published in the January CDI Journal, in which 73% (178 individuals) indicated that their physician advisor spends five hours or less dedicated to CDI efforts, and 54% described their advisor as either moderately effective or ineffective.

Data

I think it is important to have data to effectively measure any focus area of interest. I believe a couple of key metrics provide insight into the level of success with physician engagement. In any analysis, I would include items such as:

  • Physician response rates
  • Severity of illness (SOI)/risk of mortality (ROM) data
  • Trends in volume of queries and more specifically the focus of queries (Do CDI staff ask the same queries repeatedly?)

I specifically would not include physician agreement rate, except in the broader sense of looking for individual outlier physicians: those who agree to whatever the CDI specialist asks, or those who never agree with the premise of a CDI specialist’s query.
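As a concrete illustration of those metrics (with a made-up query log and obviously hypothetical physician names), a sketch might look like this:

```python
# A minimal sketch of the engagement metrics above, using a made-up query log:
# overall physician response rate, plus per-physician agreement rates used only to
# spot outliers, not as a program-level target.

from collections import defaultdict

queries = [
    # (physician, answered?, agreed?) -- hypothetical records
    ("Dr. A", True, True),
    ("Dr. A", True, True),
    ("Dr. B", True, False),
    ("Dr. B", False, False),
    ("Dr. C", True, True),
]

response_rate = sum(1 for _, answered, _ in queries if answered) / len(queries)
print(f"Physician response rate: {response_rate:.0%}")

# Agreement per physician, counted only over answered queries.
per_md = defaultdict(lambda: [0, 0])   # physician -> [agreed, answered]
for md, answered, agreed in queries:
    if answered:
        per_md[md][1] += 1
        per_md[md][0] += int(agreed)

for md, (agreed, answered) in sorted(per_md.items()):
    rate = agreed / answered
    # Flag extremes; in practice you would also require a minimum query count.
    flag = "  <- outlier, review individually" if rate in (0.0, 1.0) else ""
    print(f"{md}: agreement {rate:.0%} over {answered} answered queries{flag}")
```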

As always, I’d love to hear what elements other CDI programs use to statistically validate their physicians’ involvement with and support of their CDI programs.

Resources

Quite a bit of material is available between the ACDIS online polls (I have fun with those, obviously), various blog postings, journal articles, and conference presentations that offer useful information regarding physician engagement. Several provide inspiring examples of successes. Various items from other organizations are in the public domain.

If you are interested, shoot me an e-mail or leave a comment here and I can develop a partial list of links.

Wrap-up

I am sure most agree that fostering physician engagement in CDI efforts is one of the key challenges of every CDI program.

I certainly don’t have many great answers to this question, and I’d like to hear more thoughts, experiences, and success stories. I know some great examples would be wonderful Journal articles or blog posts.

I will toss in a final thought. Organizational cultural change typically takes five years. Certainly obtaining physician interest in documentation and coded data represents a significant cultural change.

Sometimes I wonder if we just need to practice a little more persistence and a lot more patience.

Thoughts on evaluating vendors/consultants

How can you tell which vendor/consultant will stand out from the crowd? A little self-preparation and planning can make a big difference.

A CDI Talk string discussed ideas for evaluating consultants—a conversation that didn’t gather much steam. So I thought I would throw some ideas out to CDI “blog-o-sphere” to discover what floats.

When considering whether to enter into a contract with a vendor or consulting firm, first determine if this particular entity’s culture and philosophy “matches” that of your own organization. Second, determine what services, products, and/or deliverables you need the consultant to provide. Third, do some legwork to obtain both direct and indirect referrals regarding the consultant’s performance. Be sure to contact referrals provided by the consultant and also gather others on your own through networking or by contacting facilities of similar size and make-up as yours.

Compatibility

You will likely be able to answer the first question, whether your philosophy and the consulting firm’s philosophy match, only after several interactions with representatives from the consulting company, and by combining a number of sources. From my viewpoint, it is a crucial question but ultimately one that each program/organization really needs to answer for itself.

Of course, understanding whether your program is compatible with the philosophy/goals of the consultant also depends on whether you and your program staff have a solid understanding of trends in the CDI profession. Your CDI team and manager must possess a working knowledge of commonly accepted best practices and overall trends of CDI practice from a national perspective.

Frequently, facility leaders hire a firm to provide their facility with that level of insight. Of course, consultants travel the country working with diverse program types and sizes and can provide such an overarching global perspective. However, comparing their perspective against your own awareness can foster interesting dialogues regarding program goals and parameters. From that position of self- and industry-knowledge, you (as the hiring agent) can consciously develop your own thoughts and positions and compare them against the particular consulting firm’s vision. Hmmm, already quite a bit of hard thought and work!

Products and services

Evaluate your CDI program and its interactions with other, related departments, to determine exactly what product and services you need.

Perhaps when you attended last year’s ACDIS National Conference, you saw a great demonstration of a new query software package. After further discussion, management at your facility wants to move forward with discussions regarding it and other similar products. As it turns out the vendor or consulting firm you remember from the conference not only provides a CDI-related query tool but other products and services as well. What should you do?

Again be aware of the scope and particulars of your facility’s needs. Then, ask different vendors a lot of questions. Keep a spreadsheet of the questions you’ve asked, how each vendor responded, and which vendor representative answered which question on what date.
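For what it’s worth, that tracking spreadsheet can be as simple as a CSV file; a minimal sketch (with invented vendors, names, and questions) might look like this:

```python
# A minimal sketch of the question-tracking spreadsheet suggested above, written as a
# CSV so it can be opened in any spreadsheet tool. Column names are just one option;
# all vendors, names, and answers below are invented.

import csv
from datetime import date

rows = [
    {"date": date(2012, 5, 1).isoformat(), "vendor": "Vendor A", "representative": "J. Smith",
     "question": "Does the query tool integrate with our EMR?", "response": "Yes, via interface"},
    {"date": date(2012, 5, 3).isoformat(), "vendor": "Vendor B", "representative": "K. Lee",
     "question": "Does the query tool integrate with our EMR?", "response": "Planned for next release"},
]

with open("vendor_questions.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "vendor", "representative", "question", "response"])
    writer.writeheader()
    writer.writerows(rows)
```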

Determine what tools or products will be provided (and whether the contract includes periodic updates to these), such as:

  • Hardware, software, or system integration
  • Sample queries, policies, and procedures
  • Physician education pocket cards, handouts, PowerPoint presentations, educational materials
  • Reference books and additional education
  • On-call services
  • Periodic auditing
  • Regular systems analysis and reporting
  • Ongoing staff training

Once you’ve worked out the details of the services and products the consulting firm will provide, dig a little deeper to determine the scope and range of education the vendor will provide to your staff. After all, it does no one any good to have a great gadget if no one knows how to use it. Ask:

  • Will the vendor provide training to coders, CDI staff, physicians, and upper management individually?
  • Will it tailor sessions to the needs of individual groups/departments?
  • Will it guarantee the qualifications and expertise of its instructors? If your staff wants a particular individual with specific qualifications to train your physicians, be sure to include that in the contract.
  • Will the organization gain the rights to the education materials, or does the consulting firm retain them? (Once the contract concludes, will the facility need to develop its own educational program or query forms?)
  • What forms of ongoing support and education will the vendor offer in the first few months and years?

Additionally, create a plan to evaluate the vendor’s effectiveness. Develop an implementation timeline and set reasonable deadlines for deliverables.

References

As I mentioned earlier, vendors/consultants should freely offer names of referrals of people and programs with which they have a positive working relationship. Be aware, of course, that these sources will most likely express glowing recommendations of the company with whom they are contracted. That’s to be expected. Everyone wants to put their best face forward. You wouldn’t tell a prospective employer to call an old boss with whom you’ve had an unresolved dispute, would you? Therefore, be sure to target your questions to these references to get the most well-rounded description of their experience. Ask:

  • What was the highlight of your experience with this consultant/vendor?
  • What part of the process would you do over differently, and why? (Be sure to ask the individual how he or she would change his or her own behavior.) What parts of the process do you wish the consultant/vendor had handled differently? Why?

Armed with information and background from this exchange, solicit feedback from other peers to augment your research with an unbiased assessment. You may struggle to find a similar facility that also has experience with a given vendor. Use the networking available via ACDIS (e.g., local chapter members and meetings, ACDIS social media, and message boards). Also reach out to nearby facilities to see if they’d host you for an afternoon to give you a better picture of how they use the vendor/consultant product and services on a daily basis. These questions will likely be helpful even if your neighbors and peers have experience with different consultants. You can certainly learn broader lessons. Be sure to ask:

  • How long have you worked with this vendor/consultant?
  • Can you rate your overall experience on a scale of 1-10 (10 being the best)?
  • What duties did the vendor/consultant include in your contract?
  • What did the vendor/consultant do to understand your organization’s dynamics, culture, unique factors, etc.? What adjustments did it make to its proposal in light of that understanding?
  • What were its strengths?
  • What were its weaknesses?
  • What tools did the vendor/consultant provide and how would you evaluate those tools (e.g., electronic programming, handouts and reference materials, educational sessions, manuals)?
  • What level of ongoing support does the vendor/consultant provide?
  • What type of data analysis or reporting did the vendor/consultant provide?
  • What were the tangible and concrete results of the engagement?
  • If you were to start over again, what aspects of the project or of the consultant engagement would you handle differently? Why?
  • Going forward, what would you ask the vendor/consultant to change, add, stop, or maintain?

Final thoughts

Good consultants/vendors are experts who bring value to your organization. Don’t hesitate to ask questions, to tailor their offerings to your facility’s needs, or to try something a bit different, something your facility thinks might work better locally. Similarly, don’t hesitate to decide not to follow a particular piece of advice. Of course, proceed carefully, thoughtfully, and with discussion, but still, don’t hesitate to try!

At the end of the day, it is up to you and your organization to carry your CDI program forward, so you need to be clear on what, why, and how you and your team plan to do things, and be comfortable with that.

Thankful for CDI community collaboration

Everyone has something to say; what can you learn by joining the conversation?

Over this past year, I’ve had the tremendous honor and pleasure to engage in substantive conversations with at least three organizations. This has actually been a humbling experience for me — that ‘someone’ out there felt strongly enough about my knowledge, ability, experience, and/or writings that they sought my ideas on CDI. I know I learned a lot through the process of reflection and discussion that occurred. I feel I gained so much more than I offered.

This ‘jazzed’ feeling I experienced during those conversations is the same that I’ve felt every time I’ve been able to attend a gathering of CDI professionals, every time I’ve had the opportunity to speak and teach about CDI or documentation, every time I’ve had a reflective exchange on CDI Talk, or every time I’ve enjoyed any sort of stimulating conversation.

Seems to me, these opportunities I’ve enjoyed are part of the broad concept of networking and collegial professional relationships. This is one of the strongest characteristics I feel we possess as a nascent profession — collaboration, mutual support, and exchange.

This is an important avenue for us as we advance our professionalism.

I am deeply grateful that I’ve had a variety of such experiences. I hope that many others have had the honor to feel this excited about (and due to) our CDI profession. Equally, I wish for everyone a coming year filled with professional satisfaction and fulfillment.

Social Media: Untangling the webs

Don't let participation in the World Wide Web get you tangled; simply practice professional discourse.

There are risks to activity on social media. For example, consider the case of a student doing a senior seminar experience (these can be an avenue to one’s first professional job) at a facility.  Unfortunately for this individual, before the facility moved to hire her, her potential employer went and looked at her Facebook page. The result: She was immediately no longer considered for any open position. I’ll leave the details out.

What about Google?  Have you ever Googled yourself, a potential date, an applicant, or a professional service you’re evaluating? Even out of pure curiosity?

Reviewing an individual’s social media presence and activity is a legitimate tool for potential employers and current managers.

I believe there ought to be some degree of latitude with regard to purely personal social activity, of course. Sites like Facebook were built with an aspect of entertainment and socialization embedded. But the bottom line is that any activity on the internet is captured, and potentially stored forever.

The same is true for the ACDIS Blog, CDI Talk, and related ACDIS social media venues.  Even though CDI Talk is for members, the network of membership is at least a couple thousand, and everyone has friends… or colleagues…

Though several recent unfortunate examples come to mind, I’ll only share my own personal experience.

Not too long ago I replied to a CDI Talk discussion but as soon as I hit send, I saw the context of what I’d written in a different, less appealing, light. I called the author of the original post and apologized. I was relieved to learn that this individual had not taken my response in the negative vein as I’d feared but in the positive light in which it was intended.

The networking avenues afforded to us individually through our ACDIS membership are intended to allow us to air our concerns and express our frustrations, to allow us to network with each other and to learn from each other.

However, this network is open to all members of ACDIS—the peers who share your opinions and those who may not; the friends you may have made at a nearby facility and colleagues who work alongside you at your current hospital.

You may find your words and thoughts in the hands of your current boss, or being considered 10 years from now when you are seeking new opportunities.

Don’t get the wrong idea; there is a genuine upside to participating in the social media networks that are now open to our profession. I love participating and conversing with my colleagues. My activity on the blog and on CDI Talk has already afforded me great opportunities to meet people, learn much, and bring it all back “home” to improve our own CDI program. Despite the potential pitfalls of participation, I do not plan to stop what I am doing. In fact, I wish there were more participants.

However, I often do wait before I hit send. I reread my statements and ponder how the message might be perceived. I question whether I am revealing too much. I wonder if those reading my response could be offended in any way. That isn’t to say I stifle my opinion. I just want to be sure to keep my professional integrity intact. Please do join us in the electronic networking that’s offered.

Just be careful out there.

CDI Productivity Benchmarks (A CDI Talk topic)

ACDIS surveys, reports, and polls can help answer productivity questions.

There was an excellent conversation string started on CDI Talk a couple of days ago about productivity measures and staffing models. I provided one of my typical responses there and realized that it might be worth developing into a quick, short(er) post.

The original question asked about daily expectations for an individual CDI specialist as far as initiating new cases and following up on existing cases, as well as expectations for reviews per number of discharges per year.

A lot of excellent replies, comments, and sharing followed. I shy away from quoting any specific response (you know, just like Vegas: what happens on CDI Talk stays on CDI Talk), but one of the repeated observations was how difficult it is to come up with a single figure of merit due to a number of program variations, such as:

  • number of individuals
  • program focus
  • paper vs electronic record (and which electronic record system a facility uses)
  • physician collaboration
  • CDI staff experience level and learning curve
  • additional roles/focus (ROM/SOI, POA, RAC, core measures, etc.)
  • complexity of patients

There are few (if any) true benchmarking resources that I have found outside of ACDIS. Consultants certainly have their own models, but that is not the same as an objective “what is being achieved.” All three of the following are worth reading carefully.

Let me briefly summarize some of the ACDIS survey data.  I will use the 2010 Physician Query Benchmarking Report, though the other sources generally agree.

Items that influence productivity:

  • Frequency of concurrent review: 58% daily, 24% every other day.
  • Majority of queries: 63% written paper based, 20% written electronic, 3% verbal, 12% equal mix written and verbal
  • Do you query when there is not a financial impact: 43% always, 44% frequently
  • Do you use templates for written queries: 31% always, 36% frequently, 16% sometimes, 13% never.

Direct productivity benchmark measures:

  • Do your CDI specialists have a set query quota to meet: 56% no, 38% yes (the median query quota appears to be almost a 25% query rate).
  • Median query rate about 18%
  • Median physician response 87% (with a clear break for >70% suggesting an absolute minimum)
  • Median Physician agreement 88% (again, >70%)
  • Median new charts per day of 12 (majority between 6 & 25)
  • Median repeat reviews per day of 12 (most 6 to 20)
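One simple way to use these medians is a side-by-side comparison with your own program’s numbers; a minimal sketch (the our_program values are hypothetical) follows:

```python
# A minimal sketch comparing a program's own metrics (hypothetical) against the
# medians from the 2010 Physician Query Benchmarking Report quoted above.

benchmarks = {                      # metric: (reported median, suggested floor or None)
    "query rate":             (0.18, None),
    "physician response":     (0.87, 0.70),
    "physician agreement":    (0.88, 0.70),
    "new charts per day":     (12, None),
    "repeat reviews per day": (12, None),
}

our_program = {                     # hypothetical program results
    "query rate": 0.14,
    "physician response": 0.65,
    "physician agreement": 0.91,
    "new charts per day": 10,
    "repeat reviews per day": 15,
}

for metric, (median, floor) in benchmarks.items():
    ours = our_program[metric]
    note = "  ** below suggested minimum **" if floor is not None and ours < floor else ""
    print(f"{metric:24s} ours: {ours:<6} median: {median}{note}")
```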

Most sources suggest an average combined total of around 25 charts reviewed per day. Unfortunately, when extrapolating the daily numbers, they don’t match up with what is commonly discussed for an annual productivity model, broadly between 1,300 and 1,900 cases (i.e., 20 to 25 working days a month times 12 new reviews daily times 12 months gets you to well over 2,400 cases a year).
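The arithmetic behind that mismatch is simple enough to show directly (a sketch using the median of 12 new reviews per day from the report above):

```python
# Extrapolating the daily new-review figure to an annual volume. Using the median of
# 12 new charts per day, the implied annual total sits well above the 1,300-1,900
# cases commonly cited for annual productivity models.

new_reviews_per_day = 12
for working_days_per_month in (20, 25):
    annual = new_reviews_per_day * working_days_per_month * 12
    print(f"{working_days_per_month} working days/month -> {annual:,} new reviews per year")
# -> 2,880 and 3,600 per year
```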