The Politics of Data in an ACO

Imagine the following scenario. You discover you are the victim of identity theft: purchases have been made in your name, and your personal credit has been ruined. But you are in luck. You have paid a watchdog organization to monitor your credit, and it holds the information that clears your good name. So, when you apply for a loan with a bank, you ask the credit monitoring agency to share the details of your prior credit problems and their resolution with the bank. The agency refuses, because that information might help the bank understand your needs and undercut the price of the credit monitoring service the agency resells to its own customers – i.e., you. The monitoring service won’t release your information.

Would you put up with this conflict of interest? NO!

In healthcare, we routinely tolerate a version of this conflict of interest, and in many different forms. Even though health insurers do not provide patient care directly, these payers accumulate a very useful holistic view of each patient’s history, because every claim submitted for payment tells them what care was delivered by which provider. In many instances, if this information were shared with other providers, it could improve the care management plan in a more timely manner, increasing the likelihood of higher quality care for the patient and possibly reducing the overall cost of care across an extended episode.

Here is an example: a patient is admitted to the hospital and receives a pacemaker to address his atrial fibrillation. After discharge, the patient follows up with his cardiologist, who diagnoses digoxin toxicity and reduces the digoxin dose. The patient, however, tries to save a few dollars by finishing his current prescription, only to be admitted to the hospital a couple of weeks later for the toxicity. This was an opportunity for a care manager to intervene: the cardiologist’s toxicity diagnosis was submitted to the payer, yet no new prescription was filled within a few days. The care manager could have helped the patient comply with the cardiologist’s instructions and avoid an inpatient admission.

Healthcare provider organizations and payers (and in some cases regulators) are working together to break down these walls in an effort to increase value across the spectrum of care delivery and the industry in general. However, the sometimes conflicting vested interests of these interacting payers and providers can still be an obstacle, influencing the politics of information disclosure and sharing in the emerging environment of accountable care delivery models.

There is great diversity in the organizations that collaborate to make up an ACO; this is definitely not one size fits all. Viewed from the perspective of sharing risk across parties without an immediate concern for maximizing volumes, the integrated provider-based health plans, such as Kaiser Permanente, Geisinger Health System, and Presbyterian Healthcare Services, already share this risk inherently and reap the rewards as a single organization. That’s great for the few organizations and patients already participating in one of these plans.

Unfortunately, for other organizations there is still much to be worked out regarding proactive sharing of data, both within an accountable network of providers acting across care settings and with the payer(s). Within the network, hospital systems usually have some of the infrastructure in place and know how to routinely share data between systems and applications using standard data exchange conventions such as HL7 and CCD. In collaboration with HIEs, these systems can help facilitate active data distribution, and they very often prompt the organization to address some of the more common aspects of data governance. However, even when this routine “transactional” and operational data is being exchanged and coordinated, there is still a great unmet need for the ACO to buy or build a data repository that integrates this data for reporting and analytics across various functional areas.

Many organizations encounter further challenges in defining and agreeing on which sources of specific data elements are authoritative, what rights and limits govern the use of these data, and how these assets can be used most effectively to serve the diverse objectives of this still-emerging organizational model.

An even greater challenge for some ACOs is collecting the required data from the smaller participating provider networks. These organizations often have less capability to customize their EHRs (if they even have EHRs in place) and less resource capacity to enable the data sharing that is required. To get around this, some ACOs are:

  • Standardizing on a small number of EHRs (ideally one, though that is not always possible). This increases economies of scale and leverages shared learnings across the extended organization.
  • Manually collecting data in registries. Although not always timely, this addresses some rudimentary needs of population-focused care delivery and overcomes common barriers such as a given provider’s willingness to collect additional required data and comply with standards.
  • Not collecting desired data at all. While this seems hazardous, progress toward the overall clinical and/or financial goals of the ACO can still be positive, even if the organization cannot directly attribute credit for beneficial outcomes or improvements, and the ACO avoids the overhead of collecting and manually managing that data.

Regardless of what data is collected and shared within the ACO, the payers participating often have the highest quality, most broadly useful longitudinal data because:

  1. The data is ‘omniscient’ – it represents, in most cases, all of the services received by (or at least paid for on behalf of) the patient, provided a claim for those services has been submitted to and paid by the participating payer.
  2. Much of the data is standardized and consolidated, making it easier to manage.
  3. The data is often enriched by mature information systems with derived assets such as risk models and various disease-focused or geographic population segments.

Consequently, payer data very often forms the longitudinal backbone that most consistently extends across the episodes constituting a patient’s medical history, and it is very important to the ACO’s mission to drive up quality and drive down costs. Despite this opportunity for an ACO to improve its delivery of care to targeted populations, data sharing is still achieved unevenly across these organizations, because some payers fear the utilization, cost, and performance data they hold could be used against them to weaken their negotiating position with hospitals and other provider organizations.

Claims have traditionally been the de facto standard and the basis for many of the ACO’s risk and performance measures. More progressive payers, however, are now also sharing timely data on services received outside the provider network, referrals between providers, authorizations for services, and discharges. This further enables ACOs to act proactively and to measure improvements in care management across the spectrum of care settings visited by the patients under their care.

Collaboration between provider organizations and payers at the data level is moving in a positive direction because of the effort being invested in ACO development. These efforts should continue to be encouraged so that the timely distribution of data can be leveraged for better patient treatment and healthcare cost management.

What I learned at HFMA’s Revenue Cycle Conference at Gillette Stadium

(…while the Patriots prepared to get their butts kicked)

Right from Jonathan Bush, the co-founder and CEO of athenahealth and the keynote speaker: “Make hospitals focus on what they’re good at – everything else, seek help!” I can help define “everything else”. For now, I will keep it generally confined to the world of healthcare data, because I would argue more time, money, and effort is wasted on getting good data than on almost any other activity in a hospital.

If you are a Chief Quality Officer, or Chief Medical Informatics Officer, or Chief Information Officer – what would you rather spend your budget on?

Your analysts collecting data – plugging away, constantly, all day into a spreadsheet?

Outcomes: Stale data in a static spreadsheet…that probably needs to be double- or triple-checked…that is probably different from what the other department or analyst down the hall gave you…and whose accuracy you probably wouldn’t bet your house on.

Or your analysts analyzing data and catalyzing improvement with front line leaders?

Outcomes: Real time data in a dynamic, flexible multi-dimensional reporting environment…that can roll up to the enterprise level…and drill down to the hospital → unit → provider → patient level.

Here’s a hint – this isn’t a trick question. Yet, for some reason, as you read this, you’re still spending more money on analysts reporting stale, static, inaccurate data than you are on analysts armed with real time data to improve the likelihood of higher quality and patient satisfaction scores and improved operational efficiency.

The majority of the speakers at this year’s HFMA Revenue Cycle conference seemed to accept that providers are NOT good at collecting and analyzing data, or using it as an asset to their advantage. They also seemed to align well with other speakers I’ve heard recently at HIT conferences. If you’re like 99% of your colleagues in this industry, you probably don’t understand your data either. So do what Jonathan Bush said and GET HELP!

What I Learned Last Week in Cambridge, MA at the World Congress Health Care Quality Conference

The subtitle for last week’s conference was “Moving from Volume to VALUE Based Care”. The themes that emerged from the speaker panels, presentations, and one-off conversations I had seemed well aligned:

  1. Healthcare is experiencing a paradigm shift from the traditional provider-centric mentality to a patient-centric framework.
  2. One of the biggest challenges providers face in the pursuit of higher quality is figuring out how to appropriately leverage all of the data they are currently collecting, manually and electronically.
  3. Emerging vehicles for reining in costs and improving quality, including ACOs, AQCs, PCMHs, and others, will only be effective if there are standards for implementing them and for measuring effectiveness consistently across the country.
  4. A handful of healthcare providers and payers have taken significant strides in controlling costs while improving quality by implementing technology solutions that integrate data from across the continuum of patient care.

I was encouraged by the level of enthusiasm in the room. Dr. Allan H. Gorroll from Massachusetts General Hospital and Harvard Medical School made it clear that advancing the quality agenda will require significant investments in primary care; Dr. Kate Koplan spoke about Atrius Health’s push to reduce the problems of overtesting and unnecessary treatments; Dr. John Butterly from Dartmouth Hitchcock Health discussed the Patient Centered Medical Home (PCMH) and suggested to all providers that they “have a patient on the team responsible for understanding how to establish the PCMH”; and Micky Tripathi, the President and CEO of the Massachusetts e-Health Collaborative, mentioned the challenges of turning data into actionable information, with problems like free-text data, inconsistent data collection across care settings, and many clinicians’ fear of “change” getting in the way.

I was a co-presenter at the conference as well, and I was delighted by the response to our presentation. My counterpart, Neil Ravitz, Chief Operating Officer for the Office of the Chief Medical Officer at the University of Pennsylvania Health System, and I discussed a recent solution we designed and implemented. We were able to automate the collection, integration, calculation, presentation, and dissemination of 132 inpatient safety and quality measures across 3 hospitals and 7 source application systems. This new tool consolidates measures from across these hospitals and systems into one place for reporting and analysis through dashboards and dynamic drill-down reports. The major benefits of the solution include:

  1. Changed the focus of quality and decision support analysts from data production to data analysis and action;
  2. Automated quality data collection to enable better accuracy and more timely data; and
  3. Enabled a faster quality improvement cycle time by front line leaders

Dr. Atul Gawande recently suggested in a New Yorker article that healthcare should be prepared to implement standards for nearly all of the care delivered, from total hip replacements to blood transfusions. As we all know, he is a fan of checklists, one logical tool for standardization. He also states, “Scaling good ideas has been one of our deepest problems in medicine”. When I attend healthcare conferences like the one last week in Cambridge, I’m excited by the progress I see organizations making. When I leave, though, I’m quickly reminded of the grim reality of healthcare and Dr. Gawande’s point. And then I wonder: at what point will “patient centric”, “accountable care”, “value based purchasing”, and all the other catchphrases of the past few years become the industry standard, and not the exception limited to conferences, New Yorker articles, and headlines that are only ever heard or read, and rarely experienced?

The Unknown Cost of “High Quality Outcomes” in Healthcare

“You were recently acknowledged for having high quality outcomes compared to your peers, how much is it costing you to report this information?”

I recently read an article on healthcareitnews.com, “What Makes a High Performing Hospital? Ask Premier”. So many healthcare providers are quick to tout their “quality credentials”, yet very few understand how much it costs their organization, in wasted time and money, to run around collecting the data behind those claims. The article sparked the following thoughts…

The easiest way to describe it, I’ve learned after many attempts to describe it myself, is “the tip of the iceberg”. That is the best analogy to offer a group of patient safety and quality executives, staffers, and analysts when describing the effort, patience, time, and money needed to build a “patient safety and quality dashboard” with all types of quality measures and different forms of drill down and roll up.

What most patient safety and quality folks want is a sexy dashboard or scorecard that can help them report and analyze, in a single place and tool, all of their patient safety and quality measures. It has dials and colors and all sorts of bells and whistles. From Press Ganey patient satisfaction scores, to AHRQ PSIs, Thomson Reuters and Quantros Core Measures, TheraDoc and Midas infection control measures, UHC Academic Medical Center measures…you name it. They want one place to go to see this information aggregated at the enterprise level, with the ability to drill down to the patient detail. They want to see it by Location, by Physician, by Service Line, or by Procedure/Diagnosis. This can be extremely valuable to organizations that continue to waste money on quality analysts and abstractors who simply “collect data” instead of “analyzing and acting” on it. How much time do you think your PS&Q people spend finding data and plugging away at spreadsheets? How much time is left for actual value-added analysis? I would bet very little…

So that’s what they want, but what are they willing to pay for? The answer is very little. Why?

People in patient safety and quality are experts…in patient safety and quality. What they’re not experts in is data integration, enterprise information management, metadata strategy, data quality, ETL, data storage, database design, and so on. Why do I mention all these technical disciplines? Because they ALL go into a robust, comprehensive, scalable, and extensible data integration strategy…which sits underneath that sexy dashboard you think you want. So it is easy for providers to be attracted to someone offering a “sexy dashboard” who knows diddly squat about the foundation (what you can’t see under the water) that’s required to build it. Didn’t anyone ever tell you “if it sounds too good to be true, it is”?

Healthcare’s New Mantra

Reduce Costs; Improve Outcomes & Quality; Increase Revenue & Growth

Everything we do for our healthcare clients improves these fundamental core principles – Everything! I mean it: we have a history of delivering innovative solutions to common problems, and each one of them helps accomplish these goals.

REDUCE COSTS: I know you have too many people collecting and scrubbing data – patient safety data, quality data, financial data, operational data…and so on. I also know you pay these people too much money to be mere data collectors. Stop wasting your money and their skill sets. Data collection should be a commodity; it’s definitely NOT a competitive advantage. We’ll integrate your data, clean it up before it’s used, and present it in a way that is intuitive and actionable. We’ve done it before, and guess what happened…yup, $$$$ Millions $$$$$ of dollars saved.

IMPROVE OUTCOMES: I know you spend the majority of your time collecting data, leaving very little time to analyze and act on it. Your patients don’t benefit from data collection. They benefit from your ability to take the data you’ve collected, interpret it, and embed the best practices you’ve uncovered back into the clinical workflows. They also rely on you to identify areas of improvement to educate clinicians before a small problem turns into a big lawsuit. Let us enable advanced analytics with strong data governance to improve clinical processes across the continuum of patient care.

IMPROVE QUALITY: Question: Are you quality driven or compliance driven? Ok now be honest with yourself and answer again. You can have the best processes in the world in place to massage your numbers and report out to CMS in a timely and efficient manner but guess what, that doesn’t translate into better outcomes. BUT…if you have the processes in place to ensure high quality outcomes, your quality numbers will naturally improve. Outcomes first! We’ll align your data needs with your reporting needs, automate the collection and aggregation, and put data in the hands of people who know what to do with it…(before the patients are discharged).

INCREASE REVENUE: Do you know where your high revenue drivers lie? Which procedures, physicians, payers, discharge service codes, and DRGs make you the most money? Can you plan and forecast your net patient revenue based on these changing dimensions and their expected volume 3, 6, or 9 months out? If you can, congratulations: you’re one step ahead of your competition. If you can’t, we can help you accomplish all of these goals, as well as any other need your CFO and strategic planners have.

GROW: Do you want to track where your patient referrals are coming from to get a better ROI on your marketing dollars? We’ve implemented healthcare XRM (the “X” is for any stakeholder group – patients, physician groups, managed care plans, you name it) to tie the marketing campaign directly to the patient visit.

The Struggle to Define Quality Measures: Try Finding the “True Source” of Your Data First

Let’s look at one measure that is extremely important to hospitals right now: How do you calculate Patient Re-Admission rates?

  • Do you break it down by certain characteristics of a Re-Admission like where the patient came from (“Admission Source”)?
  • Do you make sure to exclude any patients that expired (“Discharge Disposition” = expired) so your numbers aren’t inflated?
  • Do you include certain patient types and exclude others (include “Patient Type” = Inpatient or Psych; exclude “Patient Type” = Rehab)?
  • Do you exclude outliers like the patient that has been in your hospital for what seems like forever but goes back and forth from home health and nursing clinics to the hospital?
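As an illustration only (a sketch, not a standard measure definition; the field names, value sets, and exclusion rules here are hypothetical), the exclusion logic above might look like this in Python:

```python
# Hypothetical sketch of a readmission-rate calculation with common
# exclusions applied. Field names and rules are illustrative only,
# not any organization's or CMS's official measure definition.

def readmission_rate(encounters, window_days=30):
    """Percent of index admissions followed by a readmission within the window."""
    # Exclude expired patients and non-qualifying patient types up front.
    eligible = [
        e for e in encounters
        if e["discharge_disposition"] != "expired"
        and e["patient_type"] in ("Inpatient", "Psych")   # exclude e.g. Rehab
    ]
    index_admits = [e for e in eligible if not e["is_readmission"]]
    if not index_admits:
        return 0.0
    readmits = [
        e for e in eligible
        if e["is_readmission"] and e["days_since_discharge"] <= window_days
    ]
    return 100.0 * len(readmits) / len(index_admits)

encounters = [
    {"discharge_disposition": "home", "patient_type": "Inpatient",
     "is_readmission": False, "days_since_discharge": None},
    {"discharge_disposition": "home", "patient_type": "Inpatient",
     "is_readmission": False, "days_since_discharge": None},
    {"discharge_disposition": "home", "patient_type": "Inpatient",
     "is_readmission": True, "days_since_discharge": 14},
    {"discharge_disposition": "expired", "patient_type": "Inpatient",
     "is_readmission": False, "days_since_discharge": None},
    {"discharge_disposition": "home", "patient_type": "Rehab",
     "is_readmission": True, "days_since_discharge": 5},
]
print(readmission_rate(encounters))  # 50.0
```

The point is less the arithmetic than that every inclusion and exclusion is an explicit, agreed-upon rule rather than something each analyst improvises.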

Ok, let’s say you do take all these nuances into consideration when calculating your Re-Admission rate…now let me ask you this: what source are you using to collect these various data elements?

  • Your ADT / registration system?
  • Your billing system?
  • How about the patient tracking system?
  • Or the case management system?
  • How about your brand new shiny EMR?

The scenario I describe above is very real. It is highly likely that the data elements needed to calculate Re-Admission rates are scattered across multiple systems in the enterprise and, to make matters worse, the same data elements, like “Discharge Disposition” and “Admission Source”, can be found in more than one of them! Now extrapolate this problem over nearly ALL of your quality measures and you have the conundrum that Patient Safety and Quality departments struggle with in every hospital across the country. I consistently hear:

  • “How do I know everyone is calculating infection rates [CLABSIs, VAPs, CAUTIs] the same way?”
  • “How can I be sure we’re identifying the appropriate Pressure Ulcers stages consistently across units?”
  • “How can I standardize the collection and reporting of NDNQI, PQRI, and AHRQ PSIs across my units and entities (for multi-facility organizations)?”

The answer to these questions often starts with knowing where the most reliable sources of the individual data elements needed to calculate these measures sit. It gets trickier, though: even when you find the best data source, you then have to be sure the data is being entered consistently across your user community. That means:

  • Is the data being entered at the same time?
  • Is each data field restricted to certain values? Are the users entering these values or free-text?
  • Is the data being entered by the same person / role? With the same level of experience and expertise (this is especially critical in the case of identifying infections and pressure ulcers)?
  • Is the data being entered manually? Is it entered in multiple places?
  • Are there spreadsheets, documents, or paper sources holding this data that therefore cannot be automatically data-mined?
  • Do your users understand the importance of standardizing this process? Do they understand the value it provides their organization and thus their patients?

The answer to the last question often eludes many of the clinicians I run into. It is sometimes difficult for someone who has been clinically trained their entire career to appreciate the power of discrete data over free-text narrative documentation.
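A first pass at the restricted-values question above can even be automated. This is a hypothetical sketch, with invented field names and value sets, of an audit that flags entries falling outside an agreed value set (free text being the usual offender):

```python
# Hypothetical sketch: flag records whose fields fall outside agreed value
# sets. Field names and allowed values are invented for illustration.
ALLOWED = {
    "discharge_disposition": {"home", "expired", "snf", "home health"},
    "admission_source": {"ER", "transfer", "direct", "clinic"},
}

def audit(records):
    """Return (record index, field, offending value) for each violation."""
    problems = []
    for i, rec in enumerate(records):
        for field, allowed in ALLOWED.items():
            value = rec.get(field)
            if value not in allowed:
                problems.append((i, field, value))
    return problems

records = [
    {"discharge_disposition": "home", "admission_source": "ER"},
    {"discharge_disposition": "pt went home w/ family", "admission_source": "ER"},
]
print(audit(records))  # [(1, 'discharge_disposition', 'pt went home w/ family')]
```

An audit like this won’t make clinicians value discrete data, but it does make the cost of free text visible in a concrete list rather than an abstract complaint.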

One exercise we have found extremely helpful for our clients is creating a source-to-measure mapping document that identifies the agreed-upon sources of the individual data elements (both numerators and denominators) for each quality measure reported across the enterprise. Once this is created, have your clinicians, nurses, and analysts bless it, then publish it for reference so there is no ambiguity moving forward. Now everyone is reporting the same data with the same definitions. The most difficult part, though, comes when you have to hold people accountable for changing their practices to improve patient outcomes once you’re all reporting in the same language.
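One way to picture such a source-to-measure mapping, with entirely hypothetical measure names and source systems, is as a simple lookup that any analyst or report can consult:

```python
# Hypothetical source-to-measure mapping: each measure names the agreed
# authoritative system for its numerator and denominator elements.
# Measure names, elements, and systems are invented for illustration.
SOURCE_TO_MEASURE = {
    "30-day readmission rate": {
        "numerator": {"element": "readmission flag", "source": "ADT system"},
        "denominator": {"element": "index admissions", "source": "ADT system"},
    },
    "CLABSI rate": {
        "numerator": {"element": "confirmed CLABSIs",
                      "source": "infection surveillance system"},
        "denominator": {"element": "central line days",
                        "source": "nursing documentation"},
    },
}

def authoritative_source(measure, part):
    """Answer 'which system do we trust for this element?' without ambiguity."""
    return SOURCE_TO_MEASURE[measure][part]["source"]

print(authoritative_source("CLABSI rate", "denominator"))  # nursing documentation
```

In practice this mapping lives in a governed document or metadata repository rather than in code, but the principle is the same: one agreed answer per data element, published where everyone can see it.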

The Never-Ending Burden of Reporting Patient Safety & Quality Metrics

Quick: how long does it take you to collect, aggregate, and report your SCIP, PN, AMI, and HF Core Measures? How about infection control metrics like rates of CLABSI, VAP, UTI, and MRSA? Or, for that matter, any patient safety and quality metric mandated by JCAHO, CMS, your Department of Health, or anyone else? If you answered anything less than two months, and if I were a betting man, I’d bet you were lying.

There is a never-ending burden strapped to the backs of hospitals to collect, aggregate, analyze, validate, re-analyze, re-validate, report, re-validate, report again…quality measures. Reporting these quality metrics is meant to benchmark institutions across the industry on their level of care and to inform patients of their treatment options. Fortunately for the majority of institutions, it is not difficult to achieve a high rate of compliance (>80-90%), because clinicians genuinely want to provide the best standards of care. Unfortunately, achieving the highest designation under CMS guidelines (top percentile, >99%) requires hospitals to allocate a disproportionate amount of time, money, and people for very small increments of compliance. I sat with a SCIP Nurse Abstractor last week, and we spent 90 minutes drawing out, on 2 consecutive white boards, the entire process of reporting SCIP core measures from start to finish. There are over 50 steps, 5 spreadsheets/files, 4 hand-offs, 3 committees, and a partridge in a pear tree. It takes 2.5 months. I wonder how much that time and effort translates to in hard dollars, and what the return on investment is for all of it. If we’re going to start running healthcare like a business, which I argue we should, this seems like a great place to start.

STEP 1: Reduce the amount of time spent on this process by ensuring the data is trustworthy. There are way too many “validation” steps. Most people do not trust the data they’re given, and therefore end up re-validating according to their own unique way of massaging the data.

STEP 2: Integrate data from multiple sources so your Quality Abstractors and Analysts aren’t searching in 10 different places for the information they need. I’m currently helping a client implement interfaces for surgery, general lab, microbiology, blood bank, and pharmacy into their quality reporting system so their analysts can find all the information they need to report infection rates, core measures, and patient safety metrics. In addition, we built a Business Objects universe on top of the quality data store and they can do dynamic reporting in near real time. The amount of time saved is amazing and we have been successful in dramatically shifting the type of work these people are responsible for. The BI Capability Maturity Model below depicts our success helping them move from left to right.

STEP 3: Empower your analysts. With much more time to actually analyze the information, these people are the best candidates to help find errors in the data, delays in the process, and opportunities for improvement.

STEP 4: Create a mechanism for feedback based on the information you uncover. Both overachievers and underperformers alike need to be recognized for the appropriate reasons. Standardize on the best of what you find, and be sure to localize your intervention where the data is inaccurate or the process breaks down. This will also demonstrate greater transparency on your part.
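Step 2 above, in miniature: the sketch below (with invented feed and field names) folds several source feeds into one record per encounter, so an abstractor consults one place instead of many:

```python
# Miniature version of STEP 2: fold several source feeds into one record
# per encounter so abstractors stop searching multiple systems.
# Feed contents and field names are invented for illustration.
lab_feed = [{"encounter": 1, "wbc": 14.2}, {"encounter": 2, "wbc": 7.1}]
micro_feed = [{"encounter": 1, "culture": "MRSA positive"}]
pharmacy_feed = [{"encounter": 1, "abx_started": True},
                 {"encounter": 2, "abx_started": False}]

def integrate(*feeds):
    """Merge rows from every feed into one dict per encounter id."""
    merged = {}
    for feed in feeds:
        for row in feed:
            merged.setdefault(row["encounter"], {}).update(
                {k: v for k, v in row.items() if k != "encounter"}
            )
    return merged

store = integrate(lab_feed, micro_feed, pharmacy_feed)
print(store[1])
# {'wbc': 14.2, 'culture': 'MRSA positive', 'abx_started': True}
```

A real implementation involves interface engines, master patient indexes, and data-quality rules rather than a twelve-line merge, but the payoff is the same: one integrated record to report from instead of ten systems to chase.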

Strategic Finance for Service Lines: Finding Opportunities for Growth

Healthcare providers are always seeking innovations and evaluating strategic alternatives to meet growing demand, while healthcare legislation adds challenges to an already complex industry. As the population continues to age and the demand for high quality healthcare increases, providers must put themselves in the optimal financial position to deliver the best care to the communities that depend on them.

To do this, many are turning to a service line model so that they can identify profitable areas of their organization that will generate future growth and capture market share.  In order to identify the strategic value of each service line, organizations need to have a long-range planning tool that will enable them to quickly forecast each of their service lines over the next 3-5 years and evaluate growth areas so that investments can be made in the service lines that will generate the greatest long-term economic value for the organization.

Utilizing Oracle’s Hyperion Strategic Finance, Edgewater Ranzal has helped many organizations chart a realistic financial plan to achieve their long-range goals and vision.  Some of the ways that we have helped organizations are as follows:

  • Forecast detailed P&Ls for each service line using revenue and cost drivers, such as number of patients, revenue per procedure, FTEs, and payer mix, to accurately forecast the profit levels of each service line.
  • Easily consolidate the forecasted service line P&Ls to view the expected financial results at the care center level or for the healthcare organization as a whole.
  • Layer potential new service lines being evaluated into the consolidation structure to understand the incremental financial impact of adding each one.
  • Run scenarios on the key business drivers of each service line to understand how sensitive profitability, EPS, and other key metrics are to changes in variables like number of patients, payer mix, FTEs, and salary levels.
  • Compare multiple scenarios side by side to evaluate the risks and benefits of specific strategies.
  • Evaluate the economic value of large capital projects needed to grow specific service lines beyond their current capacity, and compare the NPV and IRR of various projects to determine which ones should be funded.
  • Layer specific capital projects into the consolidation structure and view their incremental impact on revenue growth and profitability at the service line level as well as for the healthcare organization as a whole.
  • Use the built-in funding routine of HSF to allocate cash surpluses to new investments and to analyze at what point the organization will need to secure more debt financing to fund its operations and its capital investments in specific service lines.
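To illustrate the NPV comparison mentioned above, here is a generic sketch with invented project names, cash flows, and discount rate; it is not HSF itself, which handles this analysis natively:

```python
# Illustrative only: comparing two hypothetical capital projects by NPV.
# Project names, cash flows, and the 8% discount rate are invented.

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time 0 (the investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

projects = {
    "new cath lab": [-5_000_000, 1_200_000, 1_500_000, 1_800_000, 2_000_000],
    "OR expansion": [-3_000_000, 900_000, 1_000_000, 1_100_000, 1_200_000],
}
discount_rate = 0.08

# Rank the candidate projects by NPV, highest first.
ranked = sorted(projects, key=lambda p: npv(discount_rate, projects[p]),
                reverse=True)
for name in ranked:
    print(f"{name}: NPV = {npv(discount_rate, projects[name]):,.0f}")
```

Even in this toy form, the smaller project can outrank the larger one once the time value of money is applied, which is exactly the kind of counterintuitive result a side-by-side comparison is meant to surface.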

Regardless of where you are in your understanding, analysis, or implementation of service lines, a viable long-term strategy must include a critical evaluation of how you will identify the market drivers for growth, measure sustainable financial success, and adjust to changing economic, regulatory, and financial conditions.

So You’ve Signed an Alternative Quality Contract (AQC), Now What?

There have been headlines recently announcing the partnerships, called Alternative Quality Contracts, that BCBSMA has struck with MA-based providers, which seek to flip the traditional fee-for-service model on its head. Instead of rewarding volume, these contracts advocate payment for the highest quality care, not necessarily the most expensive. BCBSMA boasts, “The AQC now includes 23% of physicians in their network, providing care to 31% of their MA-based HMO members”. There are some recognizable names on the list of organizations already signed up, including Mount Auburn, Tufts, Atrius, and Caritas Christi, among others totaling 9 hospitals and physician groups. I would argue, though, that signing the contract is the easiest part of this partnership (even though the lawyers for both sides may disagree). The foundation of the contracts is the set of quality measures that both parties agree best monitor the quality of the care being delivered. The hard part is finding the most efficient ways to calculate and report those measures so as to save money and improve revenue under this new reimbursement model. BCBSMA explicitly states, “Providers can retain margins derived from reduction of inefficiencies.” Well, where are the biggest opportunities for “reducing inefficiencies”? The biggest inefficiencies lie in the collection, aggregation, and reporting of the data required to calculate these quality measures!

Almost every hospital I’ve worked at, volunteered for, or consulted with has the same problem: it takes too long to calculate and report quality metrics. Paper-chasing and manual chart abstraction burden overqualified nurses and delay decision makers from improving their processes. By the time the information is collected, it’s too late to act on it, whether to improve a process or to standardize a best practice. Hospitals will clamor, “Our core measures are above 90%.” My response is always, “OK, but you just reported June’s numbers and it’s August! And how much did it cost you to report numbers you can’t do much with, because you’re two months behind the information?”
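To make the contrast concrete: once the underlying data is captured discretely rather than buried in paper charts, a core-measure rate is a one-pass computation instead of weeks of abstraction. The sketch below is purely illustrative; the measure shown (aspirin at discharge for AMI patients) and all field names are my own assumptions, not any hospital's actual schema.

```python
# Hypothetical sketch: computing a core-measure compliance rate from
# discrete discharge records instead of manual chart abstraction.
# Field names and the measure definition are illustrative assumptions.
discharges = [
    {"dx": "AMI", "aspirin_at_discharge": True},
    {"dx": "AMI", "aspirin_at_discharge": False},
    {"dx": "CHF", "aspirin_at_discharge": True},  # not in the denominator
]

# Denominator: eligible patients; numerator: those meeting the measure.
eligible = [d for d in discharges if d["dx"] == "AMI"]
rate = sum(d["aspirin_at_discharge"] for d in eligible) / len(eligible)
print(f"{rate:.0%}")  # prints 50%
```

The point is not the three-line arithmetic; it is that the arithmetic becomes possible the moment the data exists in structured form, which is exactly what the paper-chasing workflow prevents.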

In Massachusetts, public reporting of physician and hospital performance is currently being developed, and this trend will undoubtedly spread to the rest of the country. The demand for transparency of true healthcare costs and quality of care, I’d argue, has never been higher. The AQCs are a great first step in the right direction. But until hospitals and physician groups tackle the bigger problem, the enormous amount of time, money, and effort currently wasted on the paper-chasing and manual chart abstraction required to collect, calculate, and report their quality metrics, these contracts will be just another piece of paper to keep track of. Good intentions don’t equate to good business.

Healthcare Analytics – A Proven Return on Investment: So What’s Taking So Long?

So what do you get when you keep all your billing data in one place, your OR management data in another, materials management in another, outcomes and quality in another, and time and labor in yet another? The answer is… over 90% of the operating rooms in America!

That’s right; the significant majority of operating rooms DO NOT have an integrated data infrastructure. In the simplest terms, that means the average OR Director/Administrator CAN’T answer questions like, “Of all orthopedic surgeons performing surgery within your organization (single or multi-facility), which surgeon performs total knee replacements with the shortest case duration, the fewest staff, the lowest complication, infection, and readmission rates, the lowest material and implant cost, and the highest rate of reimbursement?” In other words, they can’t tell you who their highest-quality, most profitable, least risky, least costly, best-performing surgeon is in their highest-revenue surgical specialty. Yes, I’m telling you that they can’t distinguish the good from the bad, the ugly from the, well, uglier, and the 2.5-star from the 5-star. Are you still wondering why there is such a strong push for transparency of healthcare data to the average consumer?
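It is worth noticing how simple the OR Director's question becomes once the data lives in one place: it is just a grouped aggregation over integrated case records. The sketch below is a toy illustration under my own assumptions; the surgeons, field names, and dollar figures are invented, and a real composite ranking would weigh far more dimensions than the three shown.

```python
# Hypothetical sketch: with OR, quality, and billing data integrated into
# one case-level record, ranking surgeons is a grouped aggregation.
# All surgeons, fields, and figures are illustrative assumptions.
from collections import defaultdict

cases = [  # one row per total-knee case, merged from the source systems
    {"surgeon": "A", "minutes": 95,  "complication": 0, "cost": 11200, "reimb": 14500},
    {"surgeon": "A", "minutes": 105, "complication": 1, "cost": 11800, "reimb": 14500},
    {"surgeon": "B", "minutes": 120, "complication": 0, "cost": 13900, "reimb": 14200},
    {"surgeon": "B", "minutes": 130, "complication": 1, "cost": 14600, "reimb": 14200},
]

by_surgeon = defaultdict(list)
for c in cases:
    by_surgeon[c["surgeon"]].append(c)

def profile(rows):
    """Average duration, complication rate, and margin for one surgeon."""
    n = len(rows)
    return {
        "avg_minutes": sum(r["minutes"] for r in rows) / n,
        "complication_rate": sum(r["complication"] for r in rows) / n,
        "avg_margin": sum(r["reimb"] - r["cost"] for r in rows) / n,
    }

profiles = {s: profile(rows) for s, rows in by_surgeon.items()}

# Rank on margin, breaking ties toward fewer complications and shorter cases.
best = max(profiles, key=lambda s: (profiles[s]["avg_margin"],
                                    -profiles[s]["complication_rate"],
                                    -profiles[s]["avg_minutes"]))
```

The hard part in practice is not this aggregation; it is getting the four or five source systems to produce the merged `cases` rows in the first place, which is exactly the integration gap described above.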

You’re sitting there asking yourself, “Why can’t they answer questions like who’s the least costly, most profitable, highest-quality surgeon?” The answer is simple: application-oriented analysis. Hospitals have yet to realize the benefits of healthcare analytics, that is, the ability to analyze information from multiple sources in one location, instead of coordinating individual system analysts who each hand a spreadsheet off to the next analyst, who adds her data and massages it just right before handing it off to the next guy, and then… you get the point. If vendors like McKesson, Cerner, and Epic could make revenue off of sharing data and “playing well with others,” they would, but right now they don’t. They make their money deploying their own individual solutions that may or may not integrate well with other applications such as imaging, labs, pharmacy, and electronic documentation. They will all tell you that their systems integrate, but only after you’ve signed the contract will you read that, most of the time, integration requires their own expertise to build interfaces, so you’ll need to pay for one of their consultants to do it for you. Just ask anyone who has McKesson Nursing Documentation how long it takes to upgrade the system, or how easy it is to integrate with their OR Management system so floor nurses can have the data they need on their screens when a patient arrives directly from surgery. Out-of-the-box integrated functionality? Easy-to-read, well-documented interface specifications that a DBA or programmer could script to? Apple plug-and-play convenience? Not now, not in healthcare.
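The "one location" idea at the heart of healthcare analytics can be sketched in a few lines: merge per-case records from separate systems on a shared case identifier, so analysis happens once on the combined record rather than through a chain of spreadsheet handoffs. The system names, fields, and values below are assumptions for illustration only, not any vendor's actual data model.

```python
# Hypothetical sketch: merging per-case records from three siloed systems
# (OR management, billing, quality) on a shared case ID, so one analysis
# can see all the dimensions at once. All names and values are illustrative.
or_system = {101: {"surgeon": "Dr. X", "minutes": 95}}
billing   = {101: {"charges": 14500, "implant_cost": 6200}}
quality   = {101: {"infection": False, "readmit_30d": False}}

def integrate(case_id):
    """Build one combined record for a case from every source system."""
    merged = {"case_id": case_id}
    for source in (or_system, billing, quality):
        merged.update(source.get(case_id, {}))  # missing systems contribute nothing
    return merged

case = integrate(101)
```

In a real environment the merge key is the hard-won part: the OR system, the billing system, and the quality database rarely agree on a common case or patient identifier out of the box, which is why the interface and integration work described above carries the cost it does.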

Don’t get too upset, though; there are plenty of opportunities to fix this broken system. First, understand that organizations such as Edgewater Technology have built ways to integrate data from multiple systems. In one engagement, we integrated 5 OR systems across a 7-hospital system, and the client saved $5M within the first 12 months of using the solution, realizing an ROI of 4 times the original cost. Can it be done? We proved it can. So what is taking so long for others to realize the same level of cost savings, quality improvement, and operational efficiency? I don’t know; you tell me. But don’t give me the “it’s not on our list of top priorities this year” or the “patient satisfaction and quality mandates are consuming all our resources” excuses, and don’t forget “we’re too busy with meaningful use.” Why? Because all of these would be achievable objectives if you could first understand and make sense of the data you’re collecting outside the myopic lens you’ve been looking through for the past 30 years. Wake up! This isn’t rocket science; we’re trying to do now what Gordon Gekko and the Wall Street bankers were doing in the ’80s: data warehousing, business intelligence, or whatever other flashy words you want to call it. Plain and simple, it’s integrating data to make better sense of your business operations. And until we start running healthcare like a business, we’re going to continue to sacrifice quality for volume. Are you still wondering why Medicare is broke?