The Unknown Cost of “High Quality Outcomes” in Healthcare

“You were recently acknowledged for having high-quality outcomes compared to your peers; how much is it costing you to report this information?”

I recently read an article on healthcareitnews.com, “What Makes a High Performing Hospital? Ask Premier”. So many healthcare providers are quick to tout their “quality credentials,” yet very few understand how much it costs their organization, in wasted time and money, to run around collecting the data behind those claims. The article sparked the following thoughts…

The easiest way to describe it, I’ve found after many attempts of my own, is “the tip of the iceberg”. That is the best analogy to give a group of patient safety and quality executives, staffers, and analysts when describing the effort, patience, time, and money needed to build a “patient safety and quality dashboard” spanning all types of quality measures with different forms of drill-down and roll-up.

What most patient safety and quality folks want is a sexy dashboard or scorecard that can help them report and analyze, in a single place and tool, all of their patient safety and quality measures. It has dials and colors and all sorts of bells and whistles. From Press Ganey patient satisfaction scores, to AHRQ PSIs, Thomson Reuters and Quantros Core Measures, TheraDoc and Midas infection control measures, UHC Academic Medical Center measures…you name it. They want one place to go to see this information aggregated at the enterprise level, with the ability to drill down to the patient detail. They want to see it by Location, by Physician, by Service Line, or by Procedure/Diagnosis. This can be very helpful and extremely valuable to organizations that continue to waste money on quality analysts and abstractors who simply “collect data” instead of “analyze and act” on it. How much time do you think your PS&Q people spend finding data and plugging away at spreadsheets? How much time is left for actual value-added analysis? I would bet you very little…
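
To make the drill-down and roll-up idea concrete, here is a minimal sketch in Python (pandas). It assumes the measures have already been landed in one integrated table; the column names, measure labels, and values are hypothetical, not the formats any of the vendors above actually deliver.

```python
import pandas as pd

# Hypothetical, already-integrated measure table; columns and values are
# illustrative only, not actual vendor feed formats.
measures = pd.DataFrame([
    {"facility": "North", "service_line": "Ortho",   "physician": "Dr. A",
     "measure": "PSI-12", "numerator": 2, "denominator": 180},
    {"facility": "North", "service_line": "Ortho",   "physician": "Dr. B",
     "measure": "PSI-12", "numerator": 5, "denominator": 140},
    {"facility": "South", "service_line": "Cardiac", "physician": "Dr. C",
     "measure": "PSI-12", "numerator": 1, "denominator": 95},
])

def rate(df, by):
    """Roll numerators/denominators up to any level, then compute the rate."""
    g = df.groupby(by)[["numerator", "denominator"]].sum()
    g["rate_per_1000"] = 1000 * g["numerator"] / g["denominator"]
    return g

print(rate(measures, by=["measure"]))                              # enterprise roll-up
print(rate(measures, by=["measure", "facility", "service_line"]))  # drill down by location/service line
print(rate(measures, by=["measure", "physician"]))                 # drill down by physician
```

The groupby is the easy part; getting all of those feeds into one trustworthy table in the first place is the iceberg under the water.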

So that’s what they want, but what are they willing to pay for? The answer is very little. Why?

People in patient safety and quality are experts…in patient safety and quality. What they’re not experts in is data integration, enterprise information management, metadata strategy, data quality, ETL, data storage, database design, and so on. Why do I mention all these technical principles? Because they ALL go into a robust, comprehensive, scalable and extensible data integration strategy…which sits underneath that sexy dashboard you think you want. So it is easy for providers to be attracted to someone offering a “sexy dashboard” who knows diddly squat about the foundation, the part you can’t see under the water, that’s required to build it. Didn’t anyone ever tell you “if it sounds too good to be true, it is”?

Healthcare’s Conundrum: (IN)Decision by Committee – Good at Making Friends, NOT at Making Progress

I should start by mentioning that I clearly hit a nerve with my last blog post about the huge cost “Decision by Committee” adds to the healthcare system. People agree with me, yet they are hesitant to be as straightforward as I was…so be it.

Having said that, I should be straightforward about my next point: “decision by committee” impedes progress. If you know Moore’s law, or have seen the new Best Buy commercial about the “outdated world” (which I must admit is funny), you know that technology advances very quickly. Not just in retail or gaming and entertainment, but in almost every industry. Therefore, healthcare executives are inherently doing themselves a disservice by delaying their technology upgrade and new purchasing decisions. This problem isn’t restricted to hardware and software either; it extends to integration technology (SQL Server), business rules engines, data warehousing, knowledge management sites (SharePoint), patient relationship management applications (Microsoft CRM), patient portals, etc. By the time an organization identifies the need for new technology, it has a short window to capitalize on the benefits before suffering the downsides of waiting to implement. Whether the driver is to achieve a competitive advantage, meet the demands of an evolving marketplace, comply with regulations, or satisfy individual stakeholders, all of these would benefit from a faster implementation schedule. So why does everything take so long?

Everyone knows time is money. The problem is no one is cognizant of the opportunity cost associated with delayed and prolonged decision making. They think the money clock starts ticking once the project starts. What an outdated way of managing! The clock starts ticking as soon as your organization has agreed that the need exists and that you need to find someone or something to meet it! This isn’t rocket science, people.

“Progress,” in the context of this blog, is when healthcare finally starts to achieve the efficiencies from utilizing IT that retail, banking, and even life sciences achieved 20 years ago. The main point we should all agree on: “healthcare should be run like a business,” and the last two blogs I’ve written speak directly to this. If for some reason you think this is a bad idea because “it takes away from the focus on the patient,” then stop reading, because I know you don’t work in healthcare or understand where the inefficiencies in the system lie, and we shouldn’t be talking anyway.

Unfortunately, efficient and appropriate decision making is an important organizational capability that is not characteristic of large committees in healthcare organizations. There is typically a concern that too much risk might be taken and patient care or safety could be compromised. However, the opportunity lost to indecision may be just as costly, or more so.

Healthcare’s Conundrum: (IN)Decision by Committee – Good at Making Friends, NOT at Making Sense

As anyone in sales, or consulting, or technology, or materials management, or vendor hardware, or you name it will tell you, the healthcare industry has a ridiculously long sales cycle. It takes months and even years to get approval on even the most basic goods and services. Bedpans, paint colors, implants, EMRs, servers, everything! Why? Because everything (and I really mean everything) is decision by committee. Good for fostering relationships, getting everyone’s buy-in, and singing “Kumbaya”….Bad for business.

Recent headlines and national debates have centered on the “rising costs of healthcare” as the Baby Boomers start and continue to retire in record-breaking numbers. Yes, there are ways to cut costs in almost every facet of healthcare. Why not start with the continuously rising cost of making decisions? The average American will never hear about this cost because they are not exposed to the inner workings of the healthcare industry. But ask anyone who actually works in healthcare, and they’ll be the first to admit that it simply takes way too long to make decisions, at every level of the organization. Everyone in the industry is a culprit: IT, doctors, researchers, nurses, administration, finance, and of course let’s not forget procurement. Talk about the left hand not knowing what the right hand is doing. If I had a nickel for every time a business or IT executive told me to avoid procurement at all costs, well, I’d be broke because I would’ve invested it in the stock market, but that’s another story. I would estimate that the time it takes, on average, to get signatures on contracts in healthcare adds anywhere from 10% to 40% to the cost of the actual project. This is not chump change; this is hundreds of thousands of dollars!

So, why does this inefficient, ineffective process persist? Well, for some it’s simply job security, but for the industry as a whole, I’m not sure. Why is it that healthcare providers pay executives, top managers, and other leaders tons of money only to limit their ability to lead? It is not difficult to identify the most qualified person in the room. Let him or her make the decision. Yes, supporting details are needed and blah blah blah…my point is this: it should not take 4 monthly steering committee meetings, 12 operational committee meetings, hundreds of back-and-forth emails and spreadsheets, and a few executive or board presentations sprinkled in to determine what should’ve been obvious within the first few days (maybe weeks, depending on the complexity of the decision). If you can’t make a decision, find someone who will. And then stop asking for more time, more “phone calls to discuss”, or a delayed start date. Why? Because it’s wasting your money and my time.

Strategic Finance for Service Lines: Finding Opportunities for Growth

Healthcare providers are always seeking innovations and evaluating strategic alternatives to meet growing demand, while healthcare legislation adds challenges to an already complex industry. As the population continues to age and the demand for high-quality healthcare grows, providers must put themselves in the optimal financial position to deliver the best care to the communities that depend on them.

To do this, many are turning to a service line model so that they can identify profitable areas of their organization that will generate future growth and capture market share.  In order to identify the strategic value of each service line, organizations need to have a long-range planning tool that will enable them to quickly forecast each of their service lines over the next 3-5 years and evaluate growth areas so that investments can be made in the service lines that will generate the greatest long-term economic value for the organization.

Utilizing Oracle’s Hyperion Strategic Finance, Edgewater Ranzal has helped many organizations chart a realistic financial plan to achieve their long-range goals and vision.  Some of the ways that we have helped organizations are as follows:

  • Forecast detailed P&Ls for each service line using revenue and cost drivers such as number of patients, revenue per procedure, FTEs, and payer mix to accurately forecast profit levels of each service line.
  • Easily consolidate the forecasted service line P&Ls to view the expected financial results at a care center level or for the healthcare organization as a whole.
  • Layer potential new service lines that are being evaluated into the consolidation structure to understand the incremental financial impact of adding each new service line.
  • Run scenarios on the key business drivers of each service line to understand how sensitive profitability, EPS, and other key metrics are to changes in variables like number of patients, payer mix, FTEs, and salary levels.
  • Compare multiple scenarios side by side to evaluate the risks and benefits of specific strategies.
  • Evaluate the economic value of large capital projects needed to grow specific service lines beyond their current capacity. Compare the NPV and IRR of various projects to determine which ones should be funded (a simplified illustration of this comparison follows this list).
  • Layer specific capital projects into the consolidation structure and view their incremental impact on revenue growth and profitability at the service line level as well as for the healthcare organization as a whole.
  • Use the built-in funding routine of HSF to allocate cash surpluses to new investments and to analyze at what point in time the organization will need to secure more debt financing to fund its operations and its capital investments in specific service lines.
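
To show what that NPV/IRR comparison, and the driver-based forecasting behind it, boils down to, here is a deliberately simplified sketch in Python. The drivers, rates, dollar amounts, and projects are invented for illustration; this is the underlying arithmetic, not Hyperion Strategic Finance functionality.

```python
# Driver-based forecast of a single service line's P&L, plus an NPV/IRR
# comparison of two capital projects. All figures are hypothetical.

def service_line_pnl(patients, revenue_per_case, payer_discount, ftes, salary, other_cost_per_case):
    """Profit for one year of one service line from a handful of drivers."""
    revenue = patients * revenue_per_case * (1 - payer_discount)
    cost = ftes * salary + patients * other_cost_per_case
    return revenue - cost

def npv(rate, cashflows):
    """Net present value of a cashflow stream, cashflows[0] occurring today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection; assumes one sign change in NPV."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Five-year profit forecast for a hypothetical orthopedics service line,
# growing patient volume 5% per year.
forecast = [service_line_pnl(patients=1200 * (1.05 ** y), revenue_per_case=20000,
                             payer_discount=0.45, ftes=40, salary=95000,
                             other_cost_per_case=7000) for y in range(5)]
print([round(p) for p in forecast])

# Two hypothetical capital projects: a year-0 outlay followed by incremental profit.
project_a = [-4_000_000, 900_000, 1_100_000, 1_300_000, 1_400_000, 1_500_000]
project_b = [-2_500_000, 700_000, 750_000, 800_000, 850_000, 900_000]
for name, cf in [("Project A", project_a), ("Project B", project_b)]:
    print(name, "NPV@8%:", round(npv(0.08, cf)), "IRR:", round(irr(cf), 4))
```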

Regardless of where you are in your understanding, analysis, or implementation of service lines, a viable long-term strategy must include a critical evaluation of how you will identify the market drivers for growth, measure sustainable financial success, and adjust to changing economic, regulatory, and financial conditions.

So You Think an Accountable Care Organization (ACO) is a Good Idea – First Things First, What Does Your Data Look Like?

I will not pretend to know more about Accountable Care Organizations (ACOs) than Dr. Elliot Fisher of Dartmouth Medical School who is the unofficial founder of the ideas behind this new model for healthcare reform. But if I may, I’d like to leverage some of his ideas to outline the necessary first step for creating a new ACO or being included in one of the many that already exist.

First and foremost, I understand that there are very smart people out there who insist ACOs are a bad idea (click here and here to read strong arguments against ACOs). Having said that, there are fundamental aspects of ACOs that are prerequisites for success; one of these is the ability to calculate mandated performance metrics and share this data electronically with other members of the ACO. How else are they going to know if it’s working (and by working I mean lowering costs while improving the quality of care)?

Healthcare providers are used to having to calculate quality metrics, like Core Measures among others, for the purposes of demonstrating high-quality care, complying with regulatory mandates, and satisfying their patient populations. What they’re not used to is having to report in a timely fashion. Hospitals routinely report CMS Core Measures months after the patient encounters actually occurred; the previous 3 clients I worked with reported March/April/May measures in August. The Senate bill that allows CMS to contract with ACOs specifies the need to share performance metrics among participating entities but leaves the how and how often for each ACO to decide. This could be extremely problematic considering the huge disparity across our provider networks in the healthcare IT infrastructure needed to gather, calculate, and report these metrics across care settings in a timely manner.

The first thing to consider when contemplating participation in an ACO is, “How robust is your data infrastructure, and can you meet the reporting requirements mandated for any ACO participation?” Dr. Fisher points out, “We have been collaborating with CMS, private health plans, and medical systems to identify and support the technical infrastructure needed to deliver timely, consistent performance information to ACOs and other providers.” If you think the paper-chasing and manual chart abstraction that gets you by today for most reporting requirements will fly, think again. An ACO will only be as strong as its weakest link. A successful ACO is able to monitor its progress against the benchmarks established for the total cost of delivering healthcare services per enrollee. The overall goal is to lower the cost of providing services while maintaining a high level of quality, and subsequently share the cost savings. There are other similar models, such as the Alternative Quality Contracts (AQCs) currently rolled out by BCBSMA, with similar criteria and financial incentives. In both cases, though, the fundamental data infrastructure is required to meet the stringent reporting requirements. In addition, as ACOs gain traction and identify new ways to lower the cost of providing care, a robust reporting infrastructure that eliminates the tremendous amount of time and money spent on collecting, calculating, reporting, and distributing information, including quality, operational, clinical, and financial metrics, becomes even more instrumental. The best-case scenario also includes an evolution to healthcare analytics, where analysis of data spans care settings, siloed applications, and even facilities (Chet Speed, the American Medical Group Association’s VP of Public Policy, agreed with me on this point in his recent interview with Healthcare Informatics). But first things first.
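
To put some made-up numbers behind “monitoring against a benchmark and sharing the savings,” here is a back-of-the-envelope sketch. The real CMS methodology layers in risk adjustment, minimum savings rates, and quality gates, none of which is shown here; every value below is hypothetical.

```python
# Simplified shared-savings arithmetic; NOT the actual CMS methodology, which
# adds risk adjustment, minimum savings rates, and quality scoring.
benchmark_per_enrollee = 11_000   # target cost per enrollee (hypothetical)
actual_per_enrollee    = 10_400   # observed cost per enrollee this period
enrollees              = 25_000
shared_savings_rate    = 0.50     # share of savings returned to the ACO

savings = (benchmark_per_enrollee - actual_per_enrollee) * enrollees
aco_payment = max(savings, 0) * shared_savings_rate
print(f"Total savings: ${savings:,.0f}; ACO share: ${aco_payment:,.0f}")
```

None of this arithmetic is possible, of course, unless the participating entities can actually produce timely, consistent cost and quality data in the first place.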

You can do a lot to improve your chances of success within an ACO; start with understanding the requirements for sharing discrete, accurate, consistent data. It’s a great first step. Good luck!

Healthcare Analytics – A Proven Return on Investment: So What’s Taking So Long?

So what do you get when you keep all your billing data in one place, your OR management data in another, materials management in another, outcomes and quality in another, and time and labor in yet another? The answer is…over 90% of the operating rooms in America!

That’s right; the significant majority of operating rooms DO NOT have an integrated data infrastructure. In the simplest terms, that means the average OR Director/Administrator CAN’T answer questions like, “Of all orthopedic surgeons performing surgery within your organization (single or multi-facility), which surgeon performs total knee replacements with the lowest case duration, the fewest staff, the lowest rates of complication, infection, and readmission, at the lowest material and implant cost, with the highest rate of reimbursement?” In other words, they can’t tell you who their highest quality, most profitable, least risky, least costly, best performing surgeon is in their highest revenue surgical specialty. Yes, I’m telling you that they can’t distinguish the good from the bad, the ugly from the, well, uglier, and the 2.5 star from the 5 star. Are you still wondering why there is such a strong push for transparency of healthcare data to the average consumer?
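
For the record, the question itself is not computationally hard once the data lives in one place. Here is a hedged sketch in Python (pandas) that assumes billing, OR management, quality, and materials data have already been integrated into a single case-level table, which is precisely the part most organizations are missing; the columns and values are invented for illustration.

```python
import pandas as pd

# Hypothetical integrated case-level table; in reality each of these columns
# lives in a different system (billing, OR management, quality, materials).
cases = pd.DataFrame([
    {"surgeon": "Dr. A", "procedure": "Total knee replacement", "minutes": 95,
     "staff": 5, "complication": 0, "readmit_30d": 0, "supply_cost": 6200, "reimbursement": 21000},
    {"surgeon": "Dr. A", "procedure": "Total knee replacement", "minutes": 105,
     "staff": 5, "complication": 0, "readmit_30d": 1, "supply_cost": 6400, "reimbursement": 20500},
    {"surgeon": "Dr. B", "procedure": "Total knee replacement", "minutes": 130,
     "staff": 6, "complication": 1, "readmit_30d": 0, "supply_cost": 7800, "reimbursement": 19800},
])

knees = cases[cases["procedure"] == "Total knee replacement"]
profile = knees.groupby("surgeon").agg(
    avg_minutes=("minutes", "mean"),
    avg_staff=("staff", "mean"),
    complication_rate=("complication", "mean"),
    readmit_rate=("readmit_30d", "mean"),
    avg_supply_cost=("supply_cost", "mean"),
    avg_reimbursement=("reimbursement", "mean"),
)
profile["avg_margin"] = profile["avg_reimbursement"] - profile["avg_supply_cost"]
print(profile.sort_values(["complication_rate", "avg_supply_cost"]))
```

A few lines of analysis once the integration is done; months or years of spreadsheet-chasing when it isn’t.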

You’re sitting there asking yourself, “Why can’t they answer questions like who’s the least costly, most profitable, and highest quality surgeon?” and the answer is simple: “application-oriented analysis”. Hospitals have yet to realize the benefits of healthcare analytics, that is, the ability to analyze information from multiple sources in one location, instead of trying to coordinate each individual system analyst and have them hand their spreadsheet off to the next analyst, who adds in her data and massages it just right to hand it off to the next guy, and then…ugh, you get the point. If vendors like McKesson, Cerner, and Epic could make revenue off of sharing data and “playing well with others” they would, but right now they don’t. They make their money off of deploying their own individual solutions that may or may not integrate well with other applications like imaging, labs, pharmacy, electronic documentation, etc. They will all tell you that their systems integrate, but only once you’ve signed their contract do you read that, most of the time, integration requires their own expertise to build interfaces, so you’ll need to pay for one of their consultants to come do that for you. Just ask anyone who has McKesson Nursing Documentation how long it takes to upgrade the system, or how easy it is to integrate with their OR Management system so floor nurses can have the data they need on their computer screens when the patient arrives directly from surgery. Out-of-the-box integrated functionality/capability? Easy-to-read, well documented interface specifications that a DBA or programmer could script to? Apple plug-n-play convenience? Not now, not in healthcare.

Don’t get too upset though; there are plenty of opportunities to fix this broken system. First, understand that organizations such as Edgewater Technology have built ways to integrate data from multiple systems, and guess what: we integrated 5 OR systems in a 7-hospital system, and they saved $5M within the first 12 months of using the solution, realizing an ROI of 4 times their original cost. Can it be done? We proved it can. So what is taking so long for others to realize the same level of cost savings, quality improvement, and operational efficiency? I don’t know; you tell me. But don’t give me the “it’s not on our list of top priorities this year” or the “patient satisfaction and quality mandates are consuming all our resources” or, don’t forget, the “we’re too busy with meaningful use” excuses. Why? Because all of these would be achievable objectives if you could first understand and make sense of the data you’re collecting outside the myopic lens you’ve been looking through for the past 30 years. Wake up! This isn’t rocket science; we’re trying to do now what Gordon Gekko and Wall Street bankers were doing in the ’80s: data warehousing, business intelligence, and whatever other flashy words you want to call it. Plain and simple, it’s integrating data to make better sense of your business operations. And until we start running healthcare like it’s a business, we’re going to continue to sacrifice quality for volume. Are you still wondering why Medicare is broke?

Driving Value from Your Healthcare Analytics Program

If you are a healthcare provider or payer organization contemplating an initial implementation of a Business Intelligence (BI) Analytics system, there are several areas to keep in mind as you plan your program.  The following key components appear in every successful BI Analytics program.  And the sooner you can bring focus and attention to these critical areas, the sooner you will improve your own chances for success.

First, a small note on terminology: we often hear the term “business intelligence” used as the overarching label for these now-familiar graphical dashboard UIs that enable direct, end-user access to actionable metrics, insightful analytics, and the rich, multi-dimensional substantiating data that underlie the top-level presentation.  Interactive drill-down; rules-based transformation of data, derivation of facts, and classification of events; hierarchical navigation; and role-based access to progressively disaggregated and increasingly identified data sets are some of the commonly implemented capabilities.  However, we have found that most client organizations will re-cast the “BI” moniker to a more subject- or mission-related name for the system and its intended objectives.  “Clinical Intelligence”, “Quality Analytics” and “Financial Performance Management” are examples.  These names are chosen to apply more directly to the mission, focus and objectives of the specific subject area(s) for which the system is being designed and deployed.  But such systems often carry similar expectations with regard to the data and the end-user capabilities that will be present, and the following principles apply equally.

Key Technical Components

Often, the first questions the organization asks focus on the technical components: the platforms, tools and applications that will form the eventual solution.  Which ETL tools should we use?  Which cube design tools are best for our needs?  How do we integrate with standard vocabulary services?  What do we use to design and deploy our dashboards?  These “how to” choices should follow consideration of “what” the system needs to do.

The key technical macro-components for a BI analytics system will invariably include one form or another of each of the following capabilities:

The components for data receipt and uptake most frequently receive raw transaction data from upstream (source) operational and transactional (OLTP) systems, such as an electronic medical record (EMR) system, surgery scheduling system, billing and reimbursement system, or claims processing system.  The portfolio of source systems can be diverse and extensive, and can include the messaging traffic that travels between and synchronizes these individual systems.  Primary data that is captured and exchanged can range from standard, discrete data elements, to less structured collections (e.g. documents), to diverse binary formats originating from virtually any point-of-care device type.  The primary purpose of these components is to capture the raw data (and/or meta-data) elements that reflect the domain-specific (e.g. clinical, financial) operational or event context in which the primary data originated, propagating this data downstream for consumption in an analytic context.
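
As a rough illustration of that uptake step, here is a minimal Python sketch. The file layout, field names, and source system label are invented; real feeds are more often HL7 interfaces, flat-file extracts, or database replication, each with its own parsing concerns.

```python
import csv
from datetime import datetime, timezone

def land_extract(path, source_system):
    """Capture raw rows plus the meta-data (source, load time) the analytic
    layers will need later, and stage them for downstream transformation."""
    staged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            staged.append({
                "source_system": source_system,                      # e.g. "surgery_scheduling"
                "loaded_at": datetime.now(timezone.utc).isoformat(),  # uptake timestamp
                "raw": row,                                           # untouched source payload
            })
    return staged

# Usage (hypothetical file): records = land_extract("or_cases_2011_06.csv", "surgery_scheduling")
# records[0] is then one staged record, ready for downstream mapping and enrichment.
```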

Data obtained from a mission-focused transactional system will almost invariably need to be computed, mapped, translated, combined, aggregated, aligned or otherwise transformed to enable its consumption for more analytic (OLAP) applications.  Mapping the raw source data elements onto a standard taxonomy and aligning them with a designated ontology or other conceptual model of the subject domain can enhance the stability and extend the useful life of your data.  Various forms of derived data elements will arise, including asymmetric relationships between elements in a data lineage.  This serves to enrich the raw source data, both increasing its relevance and improving its consume-ability for the anticipated (and often unanticipated) analytic contexts.

Pipelining raw data elements through various “enrichment engines” can increase their value and usefulness to a broader set of audiences.  Mapping atomic-level clinical procedure encodings to higher levels in a hierarchy; assigning cases to service lines and resolving overlap conflicts to support differing analytic objectives; and linking isolated events to a network of terms in a standard vocabulary are all examples that can improve the consume-ability of individual data elements or collections of them.
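
A toy illustration of the transformation and enrichment described in the last two paragraphs might look like the following. The codes, hierarchy, and precedence rule are invented; a real implementation would draw on a maintained standard vocabulary or grouper rather than a hard-coded dictionary.

```python
# Toy enrichment step: roll atomic procedure codes up a hierarchy and assign a
# service line, resolving overlaps with a simple precedence rule. The hierarchy
# and precedence order are illustrative only.
PROCEDURE_HIERARCHY = {
    "27447": {"family": "Joint replacement", "service_lines": ["Orthopedics"]},
    "33533": {"family": "Cardiac surgery",   "service_lines": ["Cardiovascular", "Surgery"]},
}
SERVICE_LINE_PRECEDENCE = ["Cardiovascular", "Orthopedics", "Surgery"]

def enrich(case):
    info = PROCEDURE_HIERARCHY.get(case["procedure_code"], {})
    case["procedure_family"] = info.get("family", "Unmapped")
    candidates = info.get("service_lines", [])
    # Overlap resolution: the first match in the precedence list wins.
    case["service_line"] = next((s for s in SERVICE_LINE_PRECEDENCE if s in candidates), "Unassigned")
    return case

print(enrich({"case_id": 1, "procedure_code": "33533"}))
```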

The volumes of data gathered and organized into an enterprise-scale data warehouse (EDW) or other integrated repository will require a spectrum of storage approaches to meet the competing demands of user consumption patterns and dynamics.  The demands of some data consumers will require operational data stores (ODS) that combine data from multiple sources, and deliver it to the point of consumption in a timely manner, using various high-throughput staging strategies to align key elements and minimize the latency between the occurrence of the primary event and the availability of the combined data in the ODS.  EDWs integrate (and often rationalize) information from multiple sources (and potentially multiple ODSs) often according to a comprehensive universal data model that reflects the primary data entities and their relationships, facilitating and enhancing the future consumption of the data in a wide variety of unanticipated use cases.  Data cubes can emphasize and optimize the dimensional characteristics of the data entities, and facilitate the hierarchical segmentation, navigation and exploration of the captured subject domain.
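
The dimensional idea behind data cubes can be illustrated on a toy scale with a simple pivot; the fact table below is invented, and a production cube would of course handle far larger volumes and much richer hierarchies.

```python
import pandas as pd

# A tiny fact table plus a pivot along two dimensions: the same idea a data
# cube optimizes at scale. Columns and values are illustrative.
facts = pd.DataFrame([
    {"month": "2011-05", "facility": "North", "service_line": "Ortho",   "cases": 112, "margin": 410_000},
    {"month": "2011-05", "facility": "South", "service_line": "Ortho",   "cases":  87, "margin": 295_000},
    {"month": "2011-06", "facility": "North", "service_line": "Cardiac", "cases":  64, "margin": 510_000},
])

cube = pd.pivot_table(facts, index=["facility", "service_line"], columns="month",
                      values=["cases", "margin"], aggfunc="sum", fill_value=0)
print(cube)   # hierarchical navigation: facility -> service line, by month
```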

Unstructured data elements or collections, including binary data types (e.g. images, voltage tracings, videos, audios, etc.) present different storage requirements, often including explicit separation of indexing and other meta-data from the primary data.

The delivery of data is complete when the end consumer has gained access to the desired data sets, using the required retrieval, analytics and presentation tools or applications, under the proper controls.  The user experience might include dashboards or other interactive graphical data displays and navigational UIs; rule-driven alerts to notify critical parties of specific conditions or escalating events; analytic and predictive models for exploring hypotheses or projecting “what if” scenarios; and communication tools for distributing information, corresponding with other stakeholders or trading partners on the underlying issues reflected in the information being consumed, and even tracking their resolution.
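
The rule-driven alerting mentioned above can be as simple as evaluating declared thresholds against the latest metric values; the rules, metric names, and numbers below are invented for illustration.

```python
# Minimal rule-driven alerting: evaluate declared thresholds against the latest
# metric values and name the role to notify. Rules and values are hypothetical.
RULES = [
    {"metric": "clabsi_rate_per_1000_line_days", "op": ">", "threshold": 1.5, "notify": "Infection Control"},
    {"metric": "or_first_case_on_time_pct",      "op": "<", "threshold": 0.80, "notify": "OR Director"},
]

latest = {"clabsi_rate_per_1000_line_days": 2.1, "or_first_case_on_time_pct": 0.86}

def evaluate(rules, values):
    ops = {">": lambda a, b: a > b, "<": lambda a, b: a < b}
    for r in rules:
        v = values.get(r["metric"])
        if v is not None and ops[r["op"]](v, r["threshold"]):
            yield f"ALERT -> {r['notify']}: {r['metric']} = {v} (threshold {r['op']} {r['threshold']})"

for alert in evaluate(RULES, latest):
    print(alert)
```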

These are the recurring, primary, top-level building blocks.  My next blog will delve into the program components you can apply to drive the definition, design and implementation of your analytics system.

Are you positioned for success?

Successful implementation of BI analytics requires more than a careful selection of technology platforms, tools and applications.  The selection of technical components will ideally follow the definition of the organization’s needs for these capabilities.  The program components outlined next time will offer a start on the journey to proactive embedded analytics, driving the desired improvement throughout your enterprise.

Tackling the Tough One: Master Data Management for the Healthcare Enterprise

One of the big struggles in healthcare is the difficulty of Master Data Management.  A typical regional hospital organization can have upwards of 200 healthcare applications, multiple versions of systems and, of course, many, many “hidden” departmental applications.  In that situation, Master Data Management for the enterprise as a whole can seem like a daunting task.  Experience dictates that those who are successful in this effort start with one important weapon: data and application governance.

Data and application governance can often be compared to building police stations, but it is much more than that.  Governance in healthcare must begin with an understanding of data as an asset to the enterprise.  For example, developing an Enterprise Master Patient Index (EMPI) creates a key asset that lets healthcare providers verify the identity of a patient independent of how they enter the healthcare delivery system.  Patients are more than a surgical case, an outpatient visit or a pharmacy visit.  Master data management in healthcare is the cornerstone of moving to treating patients across the entire continuum of care, independent of applications and location of care.  Bringing the ambulatory, acute care and home care settings into one view will provide assurance to patients that a healthcare organization is managing the entire enterprise.
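
To make the EMPI concept concrete, here is a deliberately naive matching sketch in Python. Production EMPI engines use probabilistic scoring, name and address normalization, and data steward review queues; none of that is shown here, and the matching rule below is far too crude for real use.

```python
# Naive patient-matching sketch for an EMPI: deterministic match on normalized
# last name + first initial + date of birth, assigning one enterprise ID per
# match group. Real EMPI engines use probabilistic matching and stewardship.
from dataclasses import dataclass

@dataclass(frozen=True)
class PatientRecord:
    source: str       # e.g. "acute_EHR", "ambulatory_EHR", "home_care"
    local_id: str
    last_name: str
    first_name: str
    dob: str          # ISO date

def match_key(rec: PatientRecord) -> tuple:
    return (rec.last_name.strip().upper(), rec.first_name.strip().upper()[:1], rec.dob)

def build_empi(records):
    index, assignments, next_id = {}, {}, 1
    for rec in records:
        key = match_key(rec)
        if key not in index:
            index[key] = f"EMPI-{next_id:06d}"
            next_id += 1
        assignments[(rec.source, rec.local_id)] = index[key]
    return assignments

records = [
    PatientRecord("acute_EHR", "A123", "Smith", "John", "1950-04-02"),
    PatientRecord("ambulatory_EHR", "Z88", "SMITH", "Jonathan", "1950-04-02"),
]
print(build_empi(records))   # both records resolve to the same enterprise identifier
```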

Tracking healthcare providers and their credentials across multiple hospitals, clinics and offices is another master data management challenge.  While there are specialized applications for managing physicians’ credentials, there are no enterprise-level views that encompass all types of healthcare professionals in a large healthcare organization and their respective certifications.  In addition, this provider provisioning should be closely aligned with security and access to protected health information.  A well designed governance program can supervise the creation of this key master data and its integration across the organization.

An enterprise view of master data provides a core foundation for exploiting an organization’s data to its full potential and pays dividends beyond the required investment.  Healthcare organizations are facing many upcoming challenges with reference data as a part of master data management, especially as the mandated change from ICD-9 to ICD-10 codes approaches.  Hierarchies are the magic behind business analytics: the ability to define roll-ups and drill-downs of information.  Core business concepts should be implemented as master data; they answer the question, how does the organization view itself?  The benefits of a carefully defined and well governed master data management program are many: consistent reporting of trusted information, a common enterprise understanding of information, cost efficiencies of reliable data, improved decision making from trusted authoritative sources, and, most importantly in healthcare, improved quality of care.
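
The ICD-9 to ICD-10 transition is a good example of reference data that must be governed centrally. Here is a toy crosswalk lookup; the entries are illustrative, and a real implementation would load the CMS General Equivalence Mappings and handle one-to-many and approximate mappings under formal stewardship.

```python
# Toy ICD-9 -> ICD-10 crosswalk lookup. Entries are illustrative; a governed
# implementation would load the CMS General Equivalence Mappings (GEMs) and
# route unmapped or ambiguous codes to a data steward for resolution.
CROSSWALK = {
    "250.00": ["E11.9"],    # type 2 diabetes without complications (illustrative)
    "410.91": ["I21.3"],    # acute MI, unspecified site (illustrative)
}

def remap(icd9_code):
    targets = CROSSWALK.get(icd9_code)
    if not targets:
        raise KeyError(f"No governed mapping for ICD-9 {icd9_code}; route to data steward")
    return targets

print(remap("250.00"))
```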

Data and application governance is the key to success with master data management.  Just like an inventory, key data elements, tables and reference data must be cataloged and carefully managed.  Master data must be guarded by three types of key people: a data owner, a data steward and a data guardian.  The data owner takes responsibility for the creation and maintenance of the key asset.  The data steward is the subject matter expert who determines the quality of the master data and its appropriate application and security.  Finally, the data guardian is the information technology professional who oversees the database, the proper back-up and recovery of the data assets, and the delivery of the information.  In all three roles, accountability is essential and is overseen by an enterprise information management (EIM) group composed of key data owners and executive IT management.

In summary, master data management provides the thread that ties all other data in the enterprise together.  It is worth the challenge to create, maintain and govern properly.  For success, pick the right people, understand the process and use a reliable technology.