The Unknown Cost of “High Quality Outcomes” in Healthcare

“You were recently acknowledged for having high quality outcomes compared to your peers; how much is it costing you to report this information?”

I recently read an article on healthcareitnews.com, “What Makes a High Performing Hospital? Ask Premier”. So many healthcare providers are quick to tout their “quality credentials”, yet very few understand how much it costs their organization in wasted time and money to run around collecting the data behind these claims. The article sparked the following thoughts…

The easiest way to describe it, I’ve been told after trying many times to describe it myself, is “the tip of the iceberg”. That is the best analogy to offer a group of patient safety and quality executives, staffers, and analysts when describing the effort, patience, time, and money needed to build a “patient safety and quality dashboard” with all types of quality measures and different forms of drill down and roll up.

What most patient safety and quality folks want is a sexy dashboard or scorecard that can help them report and analyze, in a single place and tool, all of their patient safety and quality measures. It has dials and colors and all sorts of bells and whistles. From Press Ganey patient satisfaction scores to AHRQ PSIs, Thomson Reuters and Quantros Core Measures, TheraDoc and Midas infection control measures, UHC Academic Medical Center measures…you name it. They want one place to go to see this information aggregated at the enterprise level, with the ability to drill down to the patient detail. They want to see it by Location, by Physician, by Service Line, or by Procedure/Diagnosis. This can be very helpful and extremely valuable to organizations that continue to waste money on quality analysts and abstractors who simply “collect data” instead of “analyze and act” on it. How much time do you think your PS&Q people spend finding data and plugging away at spreadsheets? How much time is left for actual value-added analysis? I would bet very little…

So that’s what they want, but what are they willing to pay for? The answer is very little. Why?

People in patient safety and quality are experts…in patient safety and quality. What they’re not experts in is data integration, enterprise information management, metadata strategy, data quality, ETL, data storage, database design, and so on. Why do I mention all these technical principles? Because they ALL go into a robust, comprehensive, scalable, and extensible data integration strategy…which sits underneath that sexy dashboard you think you want. So it is easy for providers to be attracted to someone offering a “sexy dashboard” who knows diddly squat about the foundation, the part of the iceberg you can’t see under the water, that’s required to build it. Didn’t anyone ever tell you, “if it sounds too good to be true, it is”?

The Struggle to Define Quality Measures: Do You Have the Right People with the Right Skill Set Supporting This Effort?

Standardizing the definition of quality measures is hard enough when you have the right people. Unfortunately, it is too often the case that hospitals are not armed with the right people and skill sets to address this costly, complicated issue.

Over the past two years, we’ve heard a lot about the shortage of primary care physicians in this country, mostly due to the public debate about how to reform healthcare. What we haven’t heard nearly enough about is the even larger shortage of clinical analysts and informaticists. I would argue that right now, hospitals and healthcare organizations need this skill set more than almost anything else. Go to any large hospital’s website and I’d be willing to bet there is a job posting related to these roles. Here’s why.

How many times have your healthcare IT or data-related projects failed because of these two reasons (which I hear almost once a week)?

  • [IT Perspective] – “the users can’t tell us how they want to use the system, how they want to see the data, what they need out of their clinical applications…they don’t know how to ask the right questions!”
  • [Clinical Perspective] – “our people in IT don’t know the clinical world at all. Things aren’t as cut and dried as they try to make them. It’s not 0 or 1, or Yes or No – it’s more complicated than that. I wish they could just live a day in my life and see how I operate; things would be so much easier!”

And there you have it. The conundrum that almost every hospital deals with – an inefficient, ineffective relationship between its clinical users and the supporting IT department/clinical decision support (CDS) team. I wrote previously about the difficulties IT projects at hospitals face when the clinical and technical stakeholders don’t even know each other: “Dr. meet IT; IT meet Dr.” What I haven’t touched on, though, is the importance of what I like to call the “translators” that every hospital needs. These folks are the Clinical Systems Analysts, Clinical Decision Support Analysts, and Healthcare Informaticists who have a clinical education and real-world experience with workflows and processes, but also a strong understanding of information technology, clinical applications, and, most importantly, the data. These resources are invaluable to institutions that finally understand this fundamental principle: the fastest, easiest way to improve patient outcomes and reduce the cost of delivering care is to identify best practices and underperformers within your organization through the use of advanced analytics. How do you do that? You have someone who understands the data and can help the directors and managers of clinical units/care settings understand where there are opportunities for improvement. It is essential that these people “talk the clinical talk” when discussing data trends with nurses and clinicians, and “talk the IT talk” when relaying requirements and system improvements to the IT and CDS teams.

Without resources who can “straddle the fence” that sits between clinical users and CDS staff members, you’ll continue to have a disconnect between the people collecting the data and those trying to understand and report it. It’s time to find people who can play in both worlds. It’s not rocket science…even if calculating CMS Core Measures is.

The Struggle to Define Quality Measures: Try Finding the “True Source” of Your Data First

Let’s look at one measure that is extremely important to hospitals right now: How do you calculate Patient Re-Admission rates?

  • Do you break it down by certain characteristics of a Re-Admission like where the patient came from (“Admission Source”)?
  • Do you make sure to exclude any patients that expired (“Discharge Disposition” = expired) so your numbers aren’t inflated?
  • Do you include certain patient types and exclude others (include “Patient Type” = Inpatient or Psych; exclude “Patient Type” = Rehab)?
  • Do you exclude outliers like the patient that has been in your hospital for what seems like forever but goes back and forth from home health and nursing clinics to the hospital?
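
To make the iceberg concrete, here is a minimal sketch of a 30-day re-admission rate that applies the exclusions above. The table and column names (patient_id, admit_date, discharge_date, patient_type, discharge_disposition) are hypothetical stand-ins; your own field names, values, and inclusion rules have to come from your organization’s agreed definitions, not from this sketch.

```python
from datetime import timedelta
import pandas as pd

def readmission_rate(encounters: pd.DataFrame, window_days: int = 30) -> float:
    """Share of eligible index discharges followed by a re-admission
    within `window_days`. All column names and values are hypothetical."""
    df = encounters.copy()
    df["admit_date"] = pd.to_datetime(df["admit_date"])
    df["discharge_date"] = pd.to_datetime(df["discharge_date"])

    # Exclusions from the list above: expired patients, excluded patient types.
    df = df[df["discharge_disposition"] != "expired"]
    df = df[df["patient_type"].isin(["Inpatient", "Psych"])]  # e.g., exclude Rehab

    # Sort each patient's encounters, then look at the gap to the next admission.
    df = df.sort_values(["patient_id", "admit_date"])
    next_admit = df.groupby("patient_id")["admit_date"].shift(-1)
    readmitted = (next_admit - df["discharge_date"]) <= timedelta(days=window_days)

    # Outlier handling (e.g., the perpetual home-health bounce-back patient)
    # would add further filters here.
    return readmitted.mean()
```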

OK, let’s say you do take all these nuances into consideration when calculating your Re-Admission rate…now let me ask you this: what source are you using to collect these various data elements?

  • Your ADT / registration system?
  • Your billing system?
  • How about the patient tracking system?
  • Or the case management system?
  • How about your brand new shiny EMR?

The scenario I describe above is very real. It is highly likely that the data elements needed to calculate Re-Admission rates are scattered across multiple systems in the enterprise, and to make matters worse, the same data elements, like “Discharge Disposition” and “Admission Source”, can be found in multiple systems! Now extrapolate this problem over nearly ALL of your quality measures and you have the conundrum that Patient Safety and Quality departments struggle with in every hospital across the country. I consistently hear:

  • “How do I know everyone is calculating infection rates [CLABSIs, VAPs, CAUTIs] the same way?”
  • “How can I be sure we’re identifying the appropriate Pressure Ulcer stages consistently across units?”
  • “How can I standardize the collection and reporting of NDNQI, PQRI, and AHRQ PSIs across my units and entities (for multi-facility organizations)?”
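
To see why that first question is so loaded, write the formula down. An infection rate like CLABSI is conventionally expressed as infections per 1,000 central-line days; the arithmetic below is trivial (the counts are purely illustrative), which is exactly the point – the hard part is making sure every unit counts the numerator and the denominator the same way.

```python
def clabsi_rate(clabsi_count: int, central_line_days: int) -> float:
    """CLABSIs per 1,000 central-line days."""
    return clabsi_count / central_line_days * 1000

# Example: 4 infections over 2,500 line days = 1.6 per 1,000 line days.
print(clabsi_rate(4, 2500))
```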

The answer to these questions often starts with knowing where the most reliable source of each individual data element needed to calculate these measures sits. It gets trickier, though, because even when you find the best data source, you then have to be sure that the data is being entered consistently across your user community. That means:

  • Is the data being entered at the same time?
  • Is each data field restricted to certain values? Are the users entering these values or free-text?
  • Is the data being entered by the same person / role? With the same level of experience and expertise (this is especially critical in the case of identifying infections and pressure ulcers)?
  • Is the data being entered manually? Is it entered in multiple places?
  • Are there spreadsheets and documents that hold this data on paper and therefore can’t be automatically data-mined?
  • Do your users understand the importance of standardizing this process? Do they understand the value it provides their organization and thus their patients?

The answer to that last question often eludes many clinicians I run into. It is sometimes difficult for someone who has been clinically trained their entire career to appreciate the power of discrete data over free-text narrative documentation.
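
One way to start answering the checklist above is to profile each data element at its source before trusting it. A hedged sketch, assuming a hypothetical extract and an agreed list of allowed values:

```python
import pandas as pd

def profile_field(df: pd.DataFrame, field: str, allowed: set) -> dict:
    """How completely and consistently is this field populated?"""
    values = df[field].dropna()
    off_list = values[~values.isin(allowed)]          # free-text / rogue entries
    return {
        "fill_rate": len(values) / len(df),           # entered at all?
        "off_list_rate": len(off_list) / max(len(values), 1),
        "off_list_examples": off_list.unique()[:5].tolist(),
    }

# e.g., profile_field(adt_extract, "discharge_disposition",
#                     {"home", "expired", "snf", "rehab"})
```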

One exercise we have found extremely helpful for our clients is creating a source-to-measure mapping document that identifies the agreed-upon sources of the individual data elements (both numerators and denominators) for each quality measure reported across the enterprise. Once it is created, have your clinicians, nurses, and analysts bless it, then publish it for reference so there is no ambiguity moving forward. Now everyone is reporting the same data with the same definitions. The most difficult part, though, comes when you have to hold people accountable for changing their practices to improve patient outcomes once you’re all reporting in the same language.
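
What that mapping document looks like will vary, but expressing it as data rather than prose makes it easy to publish, version, and reference. A minimal sketch, with hypothetical systems and field names standing in for your own:

```python
# One entry per quality measure; sources and fields are illustrative only.
SOURCE_TO_MEASURE = {
    "30-day re-admission rate": {
        "numerator": "re-admissions within 30 days of an eligible index discharge",
        "denominator": "eligible index discharges",
        "elements": {
            "admit_date":            {"source": "ADT",     "field": "ADM.ADMIT_DT"},
            "discharge_disposition": {"source": "ADT",     "field": "ADM.DISCH_DISP"},
            "patient_type":          {"source": "billing", "field": "ACCT.PT_TYPE"},
        },
        "owner": "Patient Safety & Quality",
        "blessed_on": None,  # date the clinicians, nurses, and analysts sign off
    },
}
```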

Strategic Finance for Service Lines: Finding Opportunities for Growth

Healthcare providers are always seeking innovations and evaluating strategic alternatives to meet growing demand, while healthcare legislation adds challenges to an already complex industry. As the population continues to age and demand for high quality healthcare grows, providers must put themselves in the optimal financial position to deliver the best care to the communities that depend on them.

To do this, many are turning to a service line model to identify the profitable areas of their organization that will generate future growth and capture market share.  To assess the strategic value of each service line, organizations need a long-range planning tool that enables them to quickly forecast each of their service lines over the next 3-5 years and evaluate growth areas, so that investments can be made in the service lines that will generate the greatest long-term economic value for the organization.

Utilizing Oracle’s Hyperion Strategic Finance, Edgewater Ranzal has helped many organizations chart a realistic financial plan to achieve their long-range goals and vision.  Some of the ways that we have helped organizations are as follows:

  • Forecast detailed P&Ls for each service line using revenue and cost drivers such as number of patients, revenue per procedure, FTEs, and payer mix to accurately project the profit level of each service line.
  • Easily consolidate the forecasted service line P&Ls to view the expected financial results at a care center level or for the Healthcare organization as a whole.
  • Layer into the consolidation structure potential new service lines that are being evaluated to understand the incremental financial impact of adding this new service line.
  • Run scenarios on the key business drivers of each service line to understand how sensitive profitability, EPS, and other key metrics are to changes in variables like number of patients, payer mix, FTEs, and salary levels.
  • Compare multiple scenarios side by side to evaluate the risks and benefits of specific strategies.
  • Evaluate the economic value of large capital projects needed to grow specific service lines beyond their current capacity.  Compare the NPV and IRR of various projects to determine which ones should be funded (a simple sketch of this comparison follows the list).
  • Layer into the consolidation structure specific capital projects and view their incremental impact on revenue growth and profitability at the service line level as well as the healthcare organization as a whole.
  • Use the built-in funding routine of HSF to allocate cash surpluses to new investments and to analyze at what point in time the organization will need to secure more debt financing to fund its operations and its capital investments in specific service lines.
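
As promised above, here is a simple sketch of the NPV/IRR comparison in plain Python. The cash flows are invented for illustration; HSF handles this math (and the consolidation around it) natively, so treat this only as a statement of the underlying arithmetic.

```python
def npv(rate: float, cash_flows: list) -> float:
    """Net present value; cash_flows[0] is the upfront (negative) investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows: list, lo: float = -0.99, hi: float = 10.0) -> float:
    """Internal rate of return via bisection: the rate at which NPV = 0."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Two hypothetical capital projects for a surgical service line (5-year horizon):
cath_lab = [-5_000_000, 1_200_000, 1_500_000, 1_700_000, 1_800_000, 1_800_000]
new_or   = [-3_000_000,   900_000, 1_000_000, 1_100_000, 1_100_000, 1_100_000]

for name, cf in [("cath lab", cath_lab), ("new OR", new_or)]:
    print(f"{name}: NPV@8% = ${npv(0.08, cf):,.0f}, IRR = {irr(cf):.1%}")
```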

Regardless of where you are in your understanding, analysis, or implementation of service lines, a viable long-term strategy must include a critical evaluation of how you will identify the market drivers for growth, measure sustainable financial success, and adjust to changing economic, regulatory, and financial conditions.

So You Think an Accountable Care Organization (ACO) is a Good Idea – First Things First, What’s your Data Look Like?

I will not pretend to know more about Accountable Care Organizations (ACOs) than Dr. Elliot Fisher of Dartmouth Medical School, the unofficial founder of the ideas behind this new model for healthcare reform. But if I may, I’d like to leverage some of his ideas to outline the necessary first step for creating a new ACO or being included in one of the many that already exist.

First and foremost, I understand that there are very smart people out there who insist ACOs are a bad idea (click here and here to read strong arguments against ACOs). Having said that, there are fundamental aspects of ACOs that are prerequisites for success; one of these is the ability to calculate mandated performance metrics and share this data electronically with other members of the ACO. How else are they going to know if it’s working (and by working I mean lowering costs while improving the quality of care)?

Healthcare providers are used to calculating quality metrics, like Core Measures among others, for the purposes of demonstrating high quality care, complying with regulatory mandates, and satisfying their patient populations. What they’re not used to is reporting in a timely fashion. Hospitals routinely report CMS core measures months after the patient encounters actually occurred; the previous three clients I worked with reported March/April/May measures in August. The Senate bill that allows CMS to contract with ACOs specifies the need to share performance metrics among participating entities but leaves the how and how often to each ACO to decide. This could be extremely problematic considering the huge discrepancy across our provider networks in the healthcare IT infrastructure necessary to gather, calculate, and report these metrics across care settings in a timely manner.

The first thing to consider when contemplating participation in an ACO is, “How robust is your data infrastructure, and can you meet the reporting requirements mandated for ACO participation?” Dr. Fisher points out, “We have been collaborating with CMS, private health plans, and medical systems to identify and support the technical infrastructure needed to deliver timely, consistent performance information to ACOs and other providers.” If you think the paper-chasing and manual chart abstraction that gets you by today for most reporting requirements will fly, think again. An ACO will only be as strong as its weakest link. A successful ACO is able to monitor its progress against the benchmarks established for the total cost of delivering healthcare services per enrollee. The overall goal is to lower the cost of providing services while maintaining a high level of quality, and subsequently share the cost savings. There are other similar models, such as the Alternative Quality Contracts (AQCs) currently rolled out by BCBSMA, with similar criteria and financial incentives. In both cases, though, the same fundamental data infrastructure is required to meet the stringent reporting requirements.

In addition, as ACOs gain traction and identify new ways to lower the cost of providing care, a robust reporting infrastructure becomes even more instrumental in eliminating the tremendous amount of time and money spent collecting, calculating, reporting, and distributing information, including quality, operational, clinical, and financial metrics.  The best case scenario also includes an evolution to healthcare analytics, where analysis of data spans care settings, siloed applications, and even facilities (Chet Speed, the American Medical Group Association’s VP of Public Policy, agreed with me on this point in his recent interview with Healthcare Informatics). But first things first.

You can do a lot to improve your chances of success within an ACO; understanding the requirements for sharing discrete, accurate, consistent data is a great first step. Good luck!

So You’ve Signed an Alternative Quality Contract (AQC), Now What?

There have been headlines recently announcing the partnerships, called Alternative Quality Contracts, that BCBSMA has struck with MA-based providers to flip the traditional fee-for-service model on its head. These contracts advocate paying for the highest quality care, not necessarily the most expensive. BCBSMA boasts, “The AQC now includes 23% of physicians in their network, providing care to 31% of their MA-based HMO members”. There are some recognizable names among the nine hospitals and physician groups already signed up, including Mount Auburn, Tufts, Atrius, and Caritas Christi. I would argue, though, that signing the contract is the easiest part of this partnership (even though the lawyers for both sides may disagree). The foundation of the contracts is the set of quality measures that both parties agree represent the best way to monitor the quality of the care being delivered. The hard part is finding the most efficient ways to calculate and report those measures so as to save money and improve revenue under this new reimbursement model. BCBSMA explicitly states, “Providers can retain margins derived from reduction of inefficiencies.” Well, where are the biggest opportunities for “reducing inefficiencies”? They lie in the collection, aggregation, and reporting of the data required to calculate these quality measures!

Almost every hospital I’ve worked at, volunteered at, or consulted for has the same problem: it takes too long to calculate and report quality metrics. Paper-chasing and manual chart abstraction burden overqualified nurses and delay decision makers from improving their processes. By the time the information is collected, it’s too late to make an actionable decision to either improve or standardize the best practice. Hospitals will clamor, “Our core measures are above 90%”. My response is always, “OK, but you just reported June’s numbers and it’s August! And how much did it cost you to report those numbers, which you now can’t do much with because you’re two months behind the information?”

In Massachusetts, the public reporting of physician and hospital performance is currently being developed, and this trend will undoubtedly spread to the rest of the country. The demand for transparency into true healthcare costs and quality of care, I’d argue, has never been higher. The AQCs are a great first step in the right direction. But until hospitals and physician groups tackle the bigger problem, the huge amount of time, money, and effort currently wasted on the paper-chasing and manual chart abstraction required to collect, calculate, and report their quality metrics, these contracts will be just another piece of paper to keep track of. Good intentions don’t equate to good business.

Healthcare Analytics – A Proven Return on Investment: So What’s Taking So Long?

So what do you get when you keep all your billing data in one place, your OR management data in another, materials management in another, outcomes and quality in another, and time and labor in yet another? The answer is…over 90% of the operating rooms in America!

That’s right; the significant majority of operating rooms DO NOT have an integrated data infrastructure. In the simplest terms, that means the average OR Director/Administrator CAN’T answer questions like, “Of all orthopedic surgeons performing surgery within your organization (single or multi-facility), which surgeon performs total knee replacements with the lowest case duration, the fewest staff, the lowest rates of complication, infection, and re-admission, at the lowest material and implant cost, and with the highest rate of reimbursement?” In other words, they can’t tell you who their highest quality, most profitable, least risky, least costly, best performing surgeon is in their highest revenue surgical specialty. Yes, I’m telling you that they can’t distinguish the good from the bad, the ugly from the, well, uglier, and the 2.5 star from the 5 star. Are you still wondering why there is such a strong push for transparency of healthcare data to the average consumer?
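
For what it’s worth, once the data actually lives in one integrated store, that question becomes a few lines of analysis. A hedged sketch, with a hypothetical `cases` table standing in for the joined OR management, quality, supply chain, and billing systems:

```python
import pandas as pd

def rank_surgeons(cases: pd.DataFrame) -> pd.DataFrame:
    """cases: one row per case, already joined across source systems, with
    hypothetical columns procedure, surgeon, duration_min, staff_count,
    complication, infection, readmitted_30d, total_cost, reimbursement."""
    knees = cases[cases["procedure"] == "total knee replacement"]
    by_surgeon = knees.groupby("surgeon").agg(
        cases=("procedure", "size"),
        avg_duration_min=("duration_min", "mean"),
        avg_staff=("staff_count", "mean"),
        complication_rate=("complication", "mean"),
        infection_rate=("infection", "mean"),
        readmit_rate=("readmitted_30d", "mean"),
        avg_cost=("total_cost", "mean"),
        avg_reimbursement=("reimbursement", "mean"),
    )
    by_surgeon["margin_per_case"] = (
        by_surgeon["avg_reimbursement"] - by_surgeon["avg_cost"]
    )
    return by_surgeon.sort_values("margin_per_case", ascending=False)
```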

You’re sitting there asking yourself, “Why can’t they answer questions like who’s the least costly, most profitable, highest quality surgeon?” The answer is simple: “application-oriented analysis”. Hospitals have yet to realize the benefits of healthcare analytics – that is, the ability to analyze information from multiple sources in one location, instead of coordinating each individual system analyst, who hands her spreadsheet off to another analyst, who adds in her data and massages it just right before handing it off to the next guy, and then…ugh, you get the point. If vendors like McKesson, Cerner, and Epic could make revenue off of sharing data and “playing well with others”, they would, but right now they don’t. They make their money deploying their own individual solutions that may or may not integrate well with other applications like imaging, labs, pharmacy, and electronic documentation. They will all tell you that their systems integrate, but only after you’ve signed the contract do you read that, most of the time, it requires their own expertise to build interfaces, so you’ll need to pay for one of their consultants to come do that for you. Just ask anyone who has McKesson Nursing Documentation how long it takes to upgrade the system, or how easy it is to integrate it with their OR Management system so floor nurses can have the data they need on their screens when a patient arrives directly from surgery. Out-of-the-box integrated functionality/capability? Easy-to-read, well-documented interface specifications that a DBA or programmer could script to? Apple plug-n-play convenience? Not now, not in healthcare.

Don’t get too upset though; there are plenty of opportunities to fix this broken system. First, understand that organizations such as Edgewater Technology have built ways to integrate data from multiple systems. And guess what – we integrated 5 OR systems across a 7-hospital system, and they saved $5M within the first 12 months of using the solution, realizing an ROI of 4 times their original cost.  Can it be done? We proved it can. So what is taking so long for others to realize the same level of cost savings, quality improvement, and operational efficiency? I don’t know; you tell me. But don’t give me the “it’s not on our list of top priorities this year” or the “patient satisfaction and quality mandates are consuming all our resources” or the “we’re too busy with meaningful use” excuses. Why? Because all of these would be achievable objectives if you could first understand and make sense of the data you’re collecting outside the myopic lens you’ve been looking through for the past 30 years. Wake up! This isn’t rocket science; we’re trying to do now what Gordon Gekko and Wall Street bankers were doing in the ’80s – data warehousing, business intelligence, and whatever other flashy words you want to call it. Plain and simple, it’s integrating data to make better sense of your business operations. And until we start running healthcare like a business, we’re going to continue to sacrifice quality for volume. Are you still wondering why Medicare is broke?

Move the Quality Focus to Patient Outcomes

I had the privilege to attend the Microsoft Connected Health Conference in Bellevue, Washington on May 19-20.  Microsoft changed the format of their education sessions this year to a panel discussion including short presentations.  This new format included a moderator and several views of the topic from industry experts and key people from healthcare organizations.  One of my favorite sessions was titled “Capturing Value Across the Continuum: Healthcare Quality and Outcomes.”  If you have been following the Edgewater blogs on improving Core Measures, then you understand my interest.

The real take-away on this topic was the understanding that the focus on quality in healthcare has centered more on improving business processes than on improving patient outcomes.  The panel consisted of Kim Jackson, Director of Data Warehousing at St. Joseph Health System; Kevin Fahsholtz, Senior Director with Premier; Dr. Floyd Eisenberg, Senior Vice President for Health Information Technology at the National Quality Forum; and Dr. Richard Chung of the Hawaii Medical Services Association.  The panel represented a hospital provider (Kim), an analytics and benchmarking company (Kevin), a healthcare standards organization (Dr. Eisenberg), and a payer organization (Dr. Chung) – all of the key aspects of the healthcare quality continuum – and was focused on the real-world challenges of improving the quality of healthcare.

The key idea of improving patient outcomes dominated the hour-long discussion.  Kim Jackson noted that “the burden (of collecting data) for a hospital is overwhelming, and measuring is overtaking the work.”  Dr. Eisenberg agreed that there was a need to move the focus of quality measures to outcomes and away from small process details.  He went on to say that the real issue is the definitions of the data, and that those definitions need to be standardized.  In his role, Dr. Eisenberg is working to create a standard data model for quality measures and key definitions for the standards of care.  Dr. Chung pointed out that we need to change our “culture of care delivery” along with our awareness of the data.  Dr. Chung believes that providing visibility into the quality data helps set up a culture of change.  His experience shows that separating the data from the application software allows new understanding.

All of the panelists agreed that a key issue is developing the “single version of the truth” and eliminating conflicting information.  Kim Jackson explained that using Microsoft’s Amalga UIS product allowed St. Joseph Health System to unite, reorganize, and prioritize their data.  She pointed out that consolidating data sources from eight locations created this “single version of the truth” and reduced the administrative burden of tracking core measures.

Our experience in improving core measures parallels this panel discussion.  Success in improving healthcare quality and outcomes involves plain old hard work: collecting the right data, with the right definition, at the right time in the process, and providing it to the right people.  The need to extend data collection to tracking outcomes beyond reporting requirements is the right idea at the right time in healthcare.  Let’s not settle for the minimal reporting requirements; let’s truly track outcomes and develop the feedback loops necessary to keep them successful and improving.  It is, after all, about the patient and not mere statistics.