Epic Clarity Is Not a Data Warehouse

It’s not even the reporting tool for which your clinicians have been asking!

I have attended between four and eight patient safety and quality healthcare conferences a year for the past five years. Personally, I enjoy the opportunities to learn from what others are doing in the space. My expertise lies at the intersection of quality and technology; therefore, it’s what I’m eager to discuss at these events. I am most interested in understanding how health systems are addressing the burgeoning financial burden of reporting more (both internal and external compliance and regulatory mandates) with less (from tightening budgets and, quite honestly, allocating resources to the wrong places for the wrong reasons).

Let me be frank: there is job security for healthcare analysts, “report writers,” and decision support staff. They continue to plug away at reports, churn out dated spreadsheets, and present static, stale data without context or much value to the decision makers they serve. In my opinion, patient safety and quality departments are the worst culprits of this waste and inefficiency.

When I walk around these conferences and ask people, “How are you reporting your quality measures across the litany of applications, vendors, and care settings at your institution?” you want to know the most frequent answer I get? “Oh, we have Epic (Clarity),” “Oh, we have McKesson (HBI),” or “Oh, we have a decision support staff that does that.” I literally have to hold back a combination of emotions – amusement (because I’m so frustrated) and frustration (because all I can do is laugh). I’ll poke holes in just one example: if you have Epic and use Clarity to report, here is what you have to look forward to, straight from the mouth of a former Epic technical consultant:

“It is impossible to use Epic ‘out of the box’ because the tables in Clarity must be joined together to present meaningful data. That may mean (probably will mean) a significant runtime burden because of the processing required. Unless you defer this burden to an overnight process (ETL), the end users will experience significant wait times as their reports execute these joins – and they will wait every time the report runs. Bear in mind that this applies to all of the reports that Epic provides; all of them are based directly on Clarity. Clarity is not a data warehouse. It is merely a relational version of the Chronicles data structures and, as such, is tied closely to the Chronicles architecture rather than to a reporting structure. Report customers require de-normalized data marts for simplicity, and you need star schemas behind them for performance and code re-use.”

You can’t pretend something is what it isn’t.

Translation that healthcare people will understand: Clarity can only report on data that lives in Epic. Clarity is not the best solution for providing users with fast query and report responses. There are better solutions (data marts) that provide faster reporting and allow for integration across systems. Patient safety and quality people know that you need data from more than just your EMR to report quality measures. So why do so many of you think an EMR reporting tool is your answer?
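To make the distinction concrete, here is a minimal sketch using SQLite. The table names are invented and drastically simplified (real Clarity schemas involve hundreds of tables), but it shows why a pre-joined data mart answers report queries without the run-time join burden the consultant describes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    -- Normalized, Clarity-style layout: facts split across tables,
    -- so every report must join them at run time.
    CREATE TABLE pat_enc    (enc_id INTEGER, pat_id INTEGER, dept_id INTEGER);
    CREATE TABLE patient    (pat_id INTEGER, pat_name TEXT);
    CREATE TABLE department (dept_id INTEGER, dept_name TEXT);
    INSERT INTO pat_enc    VALUES (1, 10, 100);
    INSERT INTO patient    VALUES (10, 'DOE,JANE');
    INSERT INTO department VALUES (100, 'EMERGENCY');

    -- De-normalized data mart: the same joins executed once,
    -- by an overnight ETL process, instead of on every report run.
    CREATE TABLE encounter_mart (enc_id INTEGER, pat_name TEXT, dept_name TEXT);
    INSERT INTO encounter_mart
        SELECT e.enc_id, p.pat_name, d.dept_name
        FROM pat_enc e
        JOIN patient p    ON p.pat_id  = e.pat_id
        JOIN department d ON d.dept_id = e.dept_id;
""")

# At report time the mart needs no joins at all.
row = cur.execute(
    "SELECT pat_name, dept_name FROM encounter_mart WHERE enc_id = 1"
).fetchone()
print(row)  # ('DOE,JANE', 'EMERGENCY')
```

On three toy rows the difference is invisible; on millions of encounter rows, paying the join cost once per night instead of once per report run is the whole argument for a reporting structure separate from the transactional one.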

There is a growing sense of urgency at the highest levels in large health systems to start holding quality departments accountable for the operational dollars they continue to waste on non-value-added data crunching, report creation, and spreadsheets. Don’t believe me? Ask yourself, “Does my quality team spend more time collecting data and creating reports/spreadsheets, or interacting with the organization to improve quality and, consequently, the data?”

Be honest with yourself. The split, at best, is 70% of an FTE spent on collection and 30% on analysis and action. So – get your people out of the basement, out from behind their computer screens, and put them to work. And by work, I mean acting on data and improving quality, not just reporting it.

BIG DATA in Healthcare? Not quite yet…

Let’s be honest with ourselves. First –

“Who thinks the healthcare industry is ready for Big Data?”

Me neither…

Ok, second question,

“Who thinks providers can tackle Big Data on their own, without the help of healthcare IT consulting firms?”

Better yet,

“Can your organization?”

“Big Data” seems to be yet another catchphrase that has caught many in healthcare by surprise. They’re surprised for the same reason I am, which was recently summed up for me by a VP of Enterprise Informatics at a 10-hospital health system: “How can we be talking about managing big data when very few [providers] embrace true enterprise information management principles and can’t even manage to implement tools like enterprise data warehouses for our existing data?” Most people in healthcare who have come from telecommunications, banking, retail, and other industries that embraced Big Data long ago agree the industry still has a long way to go. In addition, vendors like Informatica, which have a proven track record of helping industries manage Big Data with their technology solutions, have yet to see significant traction with their tools in healthcare. There are plenty of other things that need to be done first before the benefits of managing Big Data come to fruition.

Have we been here before? Didn’t we previously think that EMRs were somehow going to transform the industry and “make everything simpler” to document, report from, and analyze? Yes, we now know that isn’t the case, but it should be noted that EMRs will eventually help with these initiatives IF providers have an enterprise data strategy and infrastructure in place to integrate EMR data with all the other data that litters their information landscape AND they have the right people to leverage enterprise data.

The same can be said of Big Data. It should be relatively easy for providers to develop a technical foundation that can store and manage Big Data, compared to the time and effort needed to leverage and capitalize on it once they have it. For the significant majority of the industry, the focus right now should be on realizing returns – in the form of lower costs and improved quality – from integrating small samples of data across applications, workflows, care settings, and entities. The opportunities for improvement in the existing data landscape with demonstrable value should be the top priority to mobilize stakeholders to action. Big Data will have to wait…for now.

Healthcare’s Conundrum: (IN)Decision by Committee – Good at Making Friends, NOT at Making Progress

I should start by mentioning that I clearly hit a nerve with my last blog post about the huge cost “Decision by Committee” adds to the healthcare system. People agree with me, yet are hesitant about being as straightforward as I was…so be it.

Having said that, I should be straightforward about my next point: “decision by committee” impedes progress. If you know Moore’s law, or have seen the new Best Buy commercial about the “outdated world” (which I must admit is funny), you know that technology advances very quickly – not just in retail or gaming and entertainment, but in almost every industry. Therefore, healthcare executives are inherently doing themselves a disservice by delaying their technology upgrade and new purchasing decisions. This problem isn’t restricted to hardware and software either; it extends to integration technology (SQL Server), business rules engines, data warehousing, knowledge management sites (SharePoint), patient relationship management applications (Microsoft CRM), patient portals, etc. By the time an organization identifies the need for new technology, it has a short window to capitalize on the benefits without suffering the downsides of waiting to implement. Whether the driver is to achieve a competitive advantage, meet the demands of an evolving marketplace, comply with regulations, or satisfy individual stakeholders, all would benefit from a faster implementation schedule. So why does everything take so long?

Everyone knows time is money. The problem is that no one is cognizant of the opportunity cost associated with delayed and prolonged decision making. They think the money clock starts ticking once the project starts. What an outdated way of managing! The clock starts ticking as soon as your organization has agreed that the need exists and you need to find someone or something to meet it! This isn’t rocket science, people.

“Progress,” in the context of this blog, is when healthcare finally starts to achieve the efficiencies from utilizing IT that retail, banking, and even life sciences achieved 20 years ago. The main point we should all agree on: “healthcare should be run like a business,” and the last two blogs I’ve written speak directly to this. If for some reason you think this is a bad idea because “it takes away from the focus on the patient,” then stop reading, because I know you don’t work in healthcare or understand where the inefficiencies in the system lie, and we shouldn’t be talking anyway.

Unfortunately, efficient and appropriate decision making is an important organizational capability that is not characteristic of large committees in healthcare organizations.  There is typically a concern that too much risk may be taken, compromising patient care or safety.  However, the opportunity lost to indecision may be just as costly, or more so.

Picis Exchange Global Customer Conference – “It’s All About the Data”

The Picis Exchange Global Customer Conference went off without a hitch last week in Miami. The main information sessions were categorized by the four areas of a hospital Picis specializes in: Anesthesia and Critical Care, Emergency Department, Perioperative Services, and Revenue Management Solutions (via its acquisition of LYNX Medical Systems). I was able to attend a number of sessions, network with both the company and its customers, and hear what the top priorities for this diverse group are over the next few years. As I reviewed my notes this weekend, thinking back to all the conversations I had with OR Directors, Quality Compliance Managers, Clinical Analysts, Billing and Coding Auditors, Anesthesiologists, and IS/IT Directors, one theme emerged – it’s all about the data!

The most frequent discussions centered on a few major challenges that the healthcare industry, not just Picis clients, must deal with in the coming months and years. These challenges vary in complexity and in their impact on the 5 P’s [Patients, Providers, Physicians, Payers, and Pharmaceutical Manufacturers]. The Picis customers and users who collect, analyze, present, and distribute data most efficiently and effectively around the following challenges position themselves as stable players in an increasingly turbulent industry:

  • Meaningful Use – “What data must I show to demonstrate I’m a meaningful user of Healthcare IT to realize the greatest number of financial incentives available? How can I get away from free-text narrative documentation and start collecting discrete data in anticipation of the newly announced HIMSS Analytics expanded criteria?”
  • Quality & Regulatory Compliance – “How can I improve my quality metrics such as Core Measures and keep them consistently high over time? How can I reduce the amount of time it takes for me to report my data? How can I improve my data collection, analysis, and presentation to enable decision makers with actionable data?”
  • ICD-9 to ICD-10 Conversion – “What data and processes must I have in place to demonstrate use of ICD-10 before the looming deadline? Is my technical landscape integrated and robust enough to handle the dramatic increase in ICD-10 codes? Does my user community understand the implications of the changes associated with this conversion?”
  • Resource Productivity – “How can I reduce the amount of time my staff spends chasing paper, manually abstracting charts, and analyzing free-text narrative documentation? What percentage of these processes can I automate so my staff is focused on value-added tasks?”
  • Revenue Cycle Improvement & Cost Transparency – “How can I integrate my clinical, operational, and financial data sets to understand where my opportunities are for enhanced revenue? How can I standardize these as best practices? Can I cut costs by reducing inventory on hand and redundant vendor/supply contracts, or by improving resource utilization and provider productivity? How will this impact patient volume? Am I prepared for healthcare reform’s ‘call for transparency’?”

All of these challenges, although unique, have fundamental components in common that must be established before any progress can be made. Each requires that processes be established to standardize the collection of data, ensuring accuracy and consistency so users can “trust the data.” A “single version of the truth” is essential; without it, your hospital will remain a collection of siloed pockets of expertise living in Excel spreadsheets and Access databases (best case), or paper charts and scanned documents (worst case), laboriously re-validated at every step in the information lifecycle.

Picis did a wonderful job of reinforcing its commitment to its customer base. It promised improved product features, more intuitive user interfaces, an enhanced user community for collaboration and idea sharing, and more opportunities for training. Fundamentally, Picis is a strong player in a market that seems ripe for consolidation, and its potential for growth is very high. Yet Picis will always be just that: a product company. The healthcare industry no doubt needs strong products such as Picis to drive critical operations, collect the data necessary for improved decision making, and transition from paper to automation. But Picis acknowledged, through its evolving collaboration with partners such as Edgewater Technology that understand both the technical landscape and the clinical domain, that the true spark for change will come when people and processes align with these products more effectively. This combination, built on an integrated data strategy, will be the foundation for a heightened level of care: superior patient outcomes from every dollar spent.

Move the Quality Focus to Patient Outcomes

I had the privilege to attend the Microsoft Connected Health Conference in Bellevue, Washington on May 19-20.  Microsoft changed the format of their education sessions this year to a panel discussion including short presentations.  This new format included a moderator and several views of the topic from industry experts and key people from healthcare organizations.  One of my favorite sessions was titled “Capturing Value Across the Continuum: Healthcare Quality and Outcomes.”  If you have been following the Edgewater blogs on improving Core Measures then you understand my interest.

The real take-away on this topic was the understanding that the focus on quality in healthcare has been centered more on improving business processes than on improving patient outcomes.  The panel consisted of Kim Jackson, Director of Data Warehousing at St. Joseph Health System; Kevin Fahsholtz, Senior Director with Premier; Dr. Floyd Eisenberg, Senior Vice President for Health Information Technology at the National Quality Forum; and Dr. Richard Chung of the Hawaii Medical Services Association.  The panel represented a hospital provider (Kim), an analytics and benchmarking company (Kevin), a healthcare standards organization (Dr. Eisenberg), and a payer organization (Dr. Chung) – all of the key aspects of the healthcare quality continuum – and was focused on the real-world challenges of improving the quality of healthcare.

The key idea of improving patient outcomes dominated the hour-long discussion.  Kim Jackson noted that “the burden (of collecting data) for a hospital is overwhelming, and measuring is overtaking the work.”  Dr. Eisenberg agreed that there was a need to move the focus of the quality measures to outcomes and away from the small process details.  He went on to say that the real issue is the definitions of the data, and that the definitions need to be standardized.  In his role, Dr. Eisenberg is working to create a standard data model for quality measures and key definitions for the standards of care.  Dr. Chung pointed out that we need to change our “culture of care delivery” along with the awareness of the data.  Dr. Chung believes that providing visibility into the quality data helps set up a culture of change.  His experience shows that separating the data from the application software allows new understanding.

All of the panelists agreed that a key issue is developing the “single version of the truth” and eliminating conflicting information.  Kim Jackson explained that using Microsoft’s Amalga UIS product allowed St. Joseph Health System to unite, reorganize, and prioritize their data.  She pointed out that consolidating data sources from eight locations created this “single version of the truth” and reduced the administrative burden of tracking core measures.

Our experience in improving core measures parallels this panel discussion.  Success in improving healthcare quality and outcomes involves plain old hard work – collecting the right data, with the right definition, at the right time in the process, and providing it to the right people.  The need to extend data collection to tracking outcomes beyond reporting requirements is the right idea at the right time in healthcare.  Let’s not settle for the minimal reporting requirements; let’s truly track outcomes and develop the feedback loops necessary to keep them successful and improving.  It is, after all, about the patient and not mere statistics.

Analyzing Clinical Documentation Requires Discrete Data

How many of your patients’ paper medical charts look something like this?  How many similar piles are on the front desk of the OR? The PACU manager’s office? The scheduling department? Your office?

I know: it’s not pretty, it’s barely legible, it’s written freehand, it’s clunky, it’s outdated, it’s like hearing your favorite song on an 8-track or cassette tape – it’s simply a thing of the past. Oh, and it takes a lot of time, which means it costs a lot of money.

Doctors spend a lot of time and money going to school to become experts on the human body – that’s who I want taking care of me. Unfortunately, they are burdened by a system that requires them to write specific phrases, terms, and codes just to get paid, essentially becoming experts in a set of reimbursement business rules – that’s not who I want taking care of me. Healthcare is an industry whose core infrastructure – its backbone of information centered on diagnosis, procedure, and other treatment and care delivery codes – is broken. Why? Because all of that information is currently written down, not electronic!

I’m prepared to help fix a broken system. I have personally seen over 100 different ways for a physician to write down an observation after a routine visit with a patient – differences in phrasing, penmanship/legibility, abbreviations (only officially “accepted” abbreviations, though), and interpretation.  The same goes for an appendectomy, blood work, an MRI, and an annual physical. This is unacceptable. The important information that a physician records must be entered as discrete data elements directly into a computer. This means that each piece of data has its own field – sorry, circulating nurses who love the free-text “case notes” section at the end of surgery – and the time of free-text and narrative documentation is over. Do you know how much time and money can be saved by avoiding the endless paper chasing and manual chart abstraction? Me neither, but I know it’s a lot!

How do you fix it? I’m not going to lie and tell you it’s easy. Governance helps. You can guarantee that surgeons, anesthesiologists, hospitalists, specialists, and the rest will all have their needs and comforts…and opinions. “If you want to perform surgery at this facility, you need to document your information discretely, electronically, consistently, and in a timely fashion.” Physicians are used to writing stuff down; it’s familiar, it’s comfortable, it’s home cooking. In order to change that comfortable behavior, you must emphasize the benefits: they will spend less time documenting, they will have faster clinical decision support, they will have automated and timely reporting capabilities, and they will have near-real-time feedback on their performance, benchmarks against best standards, and opportunities for improvement. Doctors can appreciate an investment in an evidence-based approach. In order to automate the collection, reporting, and analysis of the mountain of information collected every day, on every patient, in every part of the hospital, it must be entered discretely. That, or you waste more time and money than your competitor who just went all electronic. Do you really want to control costs and get paid faster?  Stop using paper and join the 21st century!
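To illustrate what “discrete” buys you, here is a hypothetical sketch. The record fields and the 60-minute antibiotic rule are invented for illustration (loosely modeled on a Core-Measures-style check), but the point is general: once documentation is captured as discrete fields instead of narrative text, a quality measure becomes a simple filter rather than a manual chart abstraction.

```python
from dataclasses import dataclass

# Each piece of data gets its own field, instead of being buried
# somewhere in a free-text "case notes" narrative.
@dataclass
class CaseRecord:
    case_id: str
    procedure_code: str              # entered discretely, e.g. a CPT code
    antibiotic_given: bool           # discrete yes/no
    incision_minutes_after_abx: int  # discrete timing value

cases = [
    CaseRecord("C1", "44950", True, 45),
    CaseRecord("C2", "44950", True, 75),
    CaseRecord("C3", "44950", False, 0),
]

# A quality check (antibiotic given within 60 minutes of incision)
# becomes a one-line filter over discrete fields.
compliant = [c.case_id for c in cases
             if c.antibiotic_given and c.incision_minutes_after_abx <= 60]
print(compliant)  # ['C1']
```

Against free-text notes, answering the same question means a person reading every chart; against discrete fields, it runs in milliseconds across every case in the hospital.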

Implementing Healthcare Service Lines – Gaining Competitive Advantage by Focusing and Optimizing the Enterprise Mission

Many hospitals and health systems are exploring integrated Service Lines as a novel organizational model to help bring focus to the diverse services they provide to patients and the various resources (labor, facilities, equipment, materials) they must manage in their relentless striving for ever higher quality and lower cost.

The notion of a Service Line is certainly not new, and there are countless examples from other industries where firms have made the transition from a “vertical” functional model, where resources are organized according to individual professional disciplines, to a more “horizontal” service line or product line model, where resources are directly aligned with the “outputs” that are produced and delivered.  Any one of us can easily call to mind large enterprises in many different industries whose product lines have a stronger identity than the overall firm itself.  However, to be clear, the primary emphasis in our present context is on the manner in which the diverse resources of the firm are organized to most effectively and efficiently deliver the products and services of the enterprise.  Identity and brand equity are simply among the assets to be managed.

In the transition to a Service Line orientation, healthcare providers will very often move through a sequence of inquiries as they incrementally define and implement the emerging model, tailoring it to their unique and specific circumstances, challenges and opportunities.

Motivation –

  • Why would we want to move to a Service Line model?
  • What is it about our specific circumstances that suggest a Service Line model would be superior?
  • What advantages would it bring?  What difficulties would it cause, or exacerbate?
  • What are our unique opportunities or capacities in terms of clinical, operational or financial excellence, or challenge, that suggest Service Lines as a way to proceed strategically?
  • Do we have best-in-industry care models, or process excellence, or evidence of best results?
  • What other organizations are like us, what have they done, and what can we learn from them?
  • What differentiates our particular strengths, weaknesses, opportunities or threats from existing experiences or illustrations?

Definition –

  • What is the logical and most motivating model for defining our Service Lines?
  • Can we best approach the universe of patient needs by primary disease type?
  • Are we best aligned by major service or treatment modality, facilities, differentiating equipment or other assets?
  • Is a combination of these best, aligned and integrated for each patient in a context of personalized medical care?
  • Are there specific populations of patients, defined along some useful dimensions, to inform our service delivery or resource deployment?

Specification –

  • How do we explicitly assign or allocate individual delivery events into one or another Service Line?
  • What is the fundamental unit of assignment (e.g. a single case; a single encounter; an episode spanning admissions; longitudinal courses of care)?
  • How do we address conflicts or overlap, where multi-disciplinary care or complex cases suggest more than one Service Line has a legitimate claim?
  • How are the resource and financial alignments defined and measured?

Metrics –

  • What specific metrics do we use to direct our investments and measure the performance of those resources and the return on those assets?
  • How do we devise a metrics strategy that integrates and leverages the complementary perspectives of clinical, operational, financial and administrative objectives?
  • How does our metrics structure address and accommodate the structure of our enterprise (e.g. health system, hospitals, ambulatory groups, key stakeholders)?
  • How do we assess comparative analysis and performance, both internally and externally?
  • What are appropriate and useful benchmarks?

Organization –

  • How do we organize our Service Lines?
  • What model do we adopt (formal divisional, matrix, task force, hybrid)?
  • How, in what form and to whom do we institute roles, authority, and accountabilities, and for what scope of mission and resources?
  • How do we ensure clinical, operational and fiscal perspectives and guidance?
  • What differences are necessary within or between facilities and other entities?
  • What communications and information systems are needed?
  • What incentive structures and programs are needed?

Change Management –

  • Where and in what form will we encounter resistance?
  • What obstacles does our plan anticipate, and how will we overcome them?
  • What limitations exist within our existing enterprise structure or resource base, and how must we address them?
  • What forces are working in favor of the needed change, and how can we invigorate those efforts?

Ongoing Management –

  • How will or must the focus, structures, resources and metrics change over time, and how do we anticipate and facilitate these challenges and opportunities in an evolutionary manner?
  • How should we characterize the various individual Service Lines from a portfolio perspective; and how do we establish and implement appropriate investment strategies and accountabilities for individual Service Lines at their position in their life cycle and in the overall portfolio?
  • What benchmarks or metrics can we adopt to enable an appropriate comparative assessment, both within the overall enterprise portfolio, and in our specific competitive context?

Building out integrated Service Lines is more than simply clarifying the operations of existing departments and specialty-focused medical services.  Trying to do better with a traditional alignment by professional specialty or discipline will likely not provoke the kinds of challenges to existing authority structures, realignment of complex multi-tiered cost allocations, and performance incentives needed to bring transformational improvements in quality and financial return.  The process must begin with a laser focus on patient outcomes and experience, and propagate backward to the fresh look that is required to evaluate the needed investments in the full context of distinctive strengths and competencies, competitive threats, market position, evidence of superior (actual or prospective) performance, demonstrable resources and capacity, and the strategic commitment to make it happen.

Implementing advanced analytics positions healthcare organizations to pursue a vision to:

  • Facilitate measurable excellence in service delivery
  • Provide insights and identify opportunities for market growth, business performance improvement, competitive advantage and return on investment
  • Foster innovation in the design and pursuit of service lines and the efficient alignment of resources
  • Promote the intelligent and proactive use of existing and emerging information assets from diverse sources
  • Inform and empower leaders, strategists and decision makers at every level: enterprise, hospital, service line, care setting, department.

How does a data-driven healthcare organization work?

As the pressure on healthcare organizations for accountability and transparency increases, the spotlight is squarely on data: how the organization gathers, validates, stores, and reports it.  In addition, the increasing level of regulatory reporting is driving home a need for certifying data – applying rigor and measurement to its quality, audit, and lineage.  As a result, a healthcare organization must develop an Enterprise Information Management approach that zeroes in on treating data as a strategic asset.  While treating data as an asset would seem obvious given the level of IT systems necessary to run a typical healthcare organization, the explosion in the volume of digital data collected and in the types of digital data (e.g. video, digital photos, audio files) has overwhelmed the ability to locate, analyze, and organize it.

A typical example of this problem comes when an organization decides to implement Business Intelligence or performance indicators with an electronic dashboard.  There are many challenges in linking data sources to corporate performance measures.  When the same data element exists in multiple places (e.g. patient IDs, encounter events), there must be a decision about the authoritative source, or “single version of the truth.”  Then there is the infamous data collision problem: Americans move around, and organizations end up with multiple addresses for what appears to be the same person – or, worse yet, multiple lists of prescribed medications that don’t match.  Reconciling data discrepancies requires returning to the original source of information – the patient – to bring it current.  Each of us can relate to filling out the form on the clipboard in the doctor’s office multiple times.  Finally, there is the problem of sparseness: we have part of the data for tracking performance, but not enough for the calculation.  This list can go on and on, but it boils down to having the right data, at the right time, and using it in the right manner.
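The data collision problem can be sketched in a few lines. This is a hypothetical illustration (the record layout and the "most recently updated wins" survivorship rule are invented for the example; real master data management uses far richer matching and survivorship logic): the same patient arrives from two source systems with conflicting addresses, and a simple rule picks the authoritative one.

```python
from datetime import date

# The same patient, as seen by two different source systems.
records = [
    {"patient_id": "P1", "address": "12 Oak St",
     "updated": date(2008, 3, 1), "source": "registration"},
    {"patient_id": "P1", "address": "98 Elm Ave",
     "updated": date(2010, 6, 9), "source": "billing"},
]

def resolve(records):
    """Pick one authoritative record per patient: latest update wins."""
    best = {}
    for r in records:
        key = r["patient_id"]
        if key not in best or r["updated"] > best[key]["updated"]:
            best[key] = r
    return best

authoritative = resolve(records)
print(authoritative["P1"]["address"])  # 98 Elm Ave
```

The hard part, of course, is not the code; it is the governance decision that "billing, most recent" (or whatever rule the organization agrees on) is the single version of the truth.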

Wouldn’t the solution simply be to create an Enterprise Data Warehouse or Operational Data Store that holds all of the cleansed, de-duplicated, latest data elements?  Certainly!  Big IF coming up: IF your organization has data governance to establish a framework for auditability of data; IF your organization can successfully map source application systems to the target enterprise store; IF your organization can establish master data management for all the key reference tables; IF your organization can agree on standard terminologies; and, most importantly, IF you can convince every employee who creates data that quality matters – not just today, but always.

One solution is to apply a key idea that made personal computers a success: build an abstraction layer.  The operating system of a personal computer established flexibility by hiding the complexity of different hardware items from the casual user through a hardware abstraction layer that most of us think of as drivers.  Video, CD, and USB drivers provide the modularity and flexibility that adapt the usefulness of the PC.  The same principle applies to data-driven healthcare organizations.  Most healthcare applications try to tout their ability to be the data warehouse solution.  However, the need for each application to improve over time introduces change and version-control issues, and thus instability, into the enterprise data warehouse.  Instead, moving the data into an enterprise data warehouse creates the abstraction layer, and the extract, transform, and load (ETL) process acts like the drivers in the PC example.  Then, as the healthcare applications evolve, they do not disrupt the Enterprise Data Warehouse, its related data marts, and, most importantly, the performance management systems that run the business.  It is not always necessary to move the data in order to create the abstraction layer, but that approach has other benefits, including the retirement of legacy applications.
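The ETL-as-driver idea above can be sketched as follows. Everything here is hypothetical (field names, application versions, and the `make_driver` helper are invented for illustration): each source application gets its own small mapping, the "driver," so that when a source renames its fields between versions, only the driver changes and the warehouse schema stays stable.

```python
# The stable warehouse schema: the "operating system" side of the analogy.
WAREHOUSE_FIELDS = ("patient_id", "encounter_date", "department")

def make_driver(field_map):
    """Build a transform ("driver") mapping one source's field names
    onto the stable warehouse schema."""
    assert set(field_map) == set(WAREHOUSE_FIELDS)
    def transform(source_row):
        return {target: source_row[src] for target, src in field_map.items()}
    return transform

# Version 1 of a source application names its fields one way...
emr_v1 = make_driver({"patient_id": "PatID",
                      "encounter_date": "EncDt",
                      "department": "Dept"})

# ...version 2 renames them.  Only the driver is rewritten; the
# warehouse, data marts, and reports downstream are untouched.
emr_v2 = make_driver({"patient_id": "PatientID",
                      "encounter_date": "EncounterDate",
                      "department": "DeptName"})

row = emr_v2({"PatientID": "P1",
              "EncounterDate": "2010-06-01",
              "DeptName": "ICU"})
print(row)  # {'patient_id': 'P1', 'encounter_date': '2010-06-01', 'department': 'ICU'}
```

Real ETL tools add staging, validation, and incremental loads on top of this, but the isolation principle is the same: version churn in the applications is absorbed at the mapping layer, not in the warehouse.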

In summary, a strong data-driven healthcare organization has to train staff on, and communicate, the importance of data as a support for performance management, and earn buy-in from the moment of data acquisition through the entire lifecycle of each key data element.  The pay-offs are big: revenue optimization, risk mitigation and elimination of redundant costs.  When a healthcare organization treats data as a strategic asset, it changes the outcome for everyone in the organization and restores trust and reliability for making key decisions.

It’s our data that’s hurting us! – Transparency in a consumer-oriented healthcare marketplace

The Internet is enabling the healthcare consumer to shop and compare like never before.  Why would there be a need to shop for a better hospital, doctor or nursing home?  The need to shop for quality care has never been more important, and the competition between hospitals and other healthcare providers is heating up.  Consumers want the best surgeon to do their procedure, in the safest hospital, and they are turning to healthcare rating sites in record numbers.  The irony of rating sites is that they depend on data provided by the hospitals, doctors and other allied providers – it is their very own data by which they are being judged.

Today, it is easy for the consumer to Google “compare doctors” or “compare hospitals” and locate numerous websites with detailed information for comparisons.  Two notable examples are leapfroggroup.com and ucomparehealthcare.com.  Leapfrog does not rely only on publicly reported data from regulatory agencies but extends that information with detailed surveys of hospitals on key issues: central line infections and infection control, for example.  One comparison website reports that the top 5% of its reporting hospitals have a 29% lower mortality rate.

No individual or healthcare organization wants an unfair report card.  With medical mistakes among the leading causes of death each year, surpassing car accidents, breast cancer and AIDS, the report card also serves healthcare organizations as guidance on critical areas of improvement.  The process of collecting regulatory reporting information in many healthcare organizations is tedious, time-consuming and often manual – a classic case of “we have the data somewhere, but not in the format that we need it.”  There are several key problems with collecting core measures and other key metrics for reporting:

  • Key data elements for the calculations are paper-based or manually compiled
  • Manual process fatigue from paper form processing
  • Automated reporting systems use sample patient populations that are too small, resulting in possible statistical errors
  • Errors in the data transfer process, especially in the hand-off of information from one area of the hospital to another, skew results
  • Inability to track a diagnosis code early enough in the patient encounter to improve on the measure outcomes
  • Lack of staff training on collecting the right information at the right time in the right format

Consumers use the reported results to compare hospital performance and make decisions about where to receive care.  As a result, healthcare organizations need to focus on data governance to address treating data as an asset, ensuring data quality and tracking the right key metrics.  Addressing this challenge will not only improve the ratings report card for healthcare organizations but will demonstrate the commitment to quality data as well as patient safety.  Better data equals better results.  In the consumer-oriented healthcare marketplace, transparency of key metrics will yield competitive advantage.

Driving Value from Your Healthcare Analytics Program – Key Program Components

If you are a healthcare provider or payer organization contemplating an initial implementation of a Business Intelligence (BI) Analytics system, there are several areas to keep in mind as you plan your program.  The following key components appear in every successful BI Analytics program.  And the sooner you can bring focus and attention to these critical areas, the sooner you will improve your own chances for success.

Key Program Components

Last time we reviewed the primary, top-level technical building blocks.  However, the technical components are not the starting point for these solutions.  Technical form must follow business function.  The technical components come to life only when the primary mission and drivers of the specific enterprise are well understood.  And these must be further developed into a program for defining, designing, implementing and evangelizing the needs and capabilities of BI and related analytics tuned to the particular needs and readiness of the organization.

Key areas that require careful attention in every implementation include the following:

We have found that healthcare organizations (and solution vendors!) have contrasting opinions on how best to align the operational data store (ODS) and enterprise data warehouse (EDW) portions of their strategy with the needs of their key stakeholders and constituencies.  The “supply-driven” approach encourages a broad-based uptake of virtually all data that originates from one or more authoritative source systems, without any real pre-qualification of the usefulness of that information for a particular purpose.  This is the hope-laden “build it and they will come” strategy.  Conversely, the “demand-driven” approach encourages a particular focus on analytic objectives and scope, and uses this focus to concentrate the initial data uptake on satisfying a defined set of analytic subject areas and contexts.  The challenge here is not to focus the incoming data stream so narrowly that it limits related exploratory analysis.

For example, a supply-driven initiative might choose to tap into an existing enterprise application integration (EAI) bus and siphon all published HL7 messages into the EDW or ODS data collection pipe.  The proponents might reason that if these messages are being published on an enterprise bus, they should be generally useful; and if they are reasonably compliant with the HL7 RIM, their integration should be relatively straightforward.  However, their usefulness for a particular analytic purpose would still need to be investigated separately.
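To make that integration point concrete, here is a minimal Python sketch of splitting a pipe-delimited HL7 v2 message into segments before loading them into the EDW or ODS pipe.  It is illustrative only: it ignores field repetitions, components, and the MSH encoding-character quirks that a real interface engine handles, and the sample message content is invented.

```python
def parse_segments(message: str) -> dict:
    """Split a pipe-delimited HL7 v2 message into {segment_id: fields}.
    Minimal sketch: no repeats, no subcomponents, no MSH special-casing."""
    segments = {}
    for line in message.strip().split("\n"):
        fields = line.split("|")
        segments[fields[0]] = fields  # keyed by segment id (MSH, PID, ...)
    return segments

# A fabricated ORU^R01 (lab result) message for illustration.
msg = ("MSH|^~\\&|LAB|HOSP|EDW|HOSP|202401010830||ORU^R01|123|P|2.5\n"
      "PID|1||555123||DOE^JANE")
seg = parse_segments(msg)
patient_id = seg["PID"][3]   # PID-3: patient identifier list
```

Even with parsing this easy, the analytic usefulness of each siphoned message type still has to be assessed separately, as noted above.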

Conversely, a demand-driven project might start with a required set of representative analytic question instances or archetypes, and drive the data sourcing effort backward toward the potentially diverging points of origin within the business operations.  For example, a surgical analytics platform to discern patterns between or among surgical cost components, OR schedule adherence, outcomes variability, payer mix, or the impact of specific material choices would depend on specific data elements that might originate from potentially disparate locations and settings.  The need here is to ensure that the data sets required to support the specific identified analyses are covered; but the collection strategy should not be so exclusive that it prevents exploration of unanticipated inquiries or analyses.

I’ll have a future blog topic on a methodology we have used successfully to progressively decompose, elaborate and refine stakeholder analytic needs into the data architecture needed to support them.

In many cases, a key objective for implementing healthcare analytics will be to bring focus to specific areas of enterprise operations: to drive improvements in quality, performance or outcomes; to drive down costs of service delivery; or to increase resource efficiency, productivity or throughput, while maintaining quality, cost and compliance.  A common element in all of these is a focus on process.  You must identify the specific processes (or workflows) that you wish to measure and monitor.  Any given process, however simple or complex, will have a finite number of “pulse points,” any one of which will provide a natural locus for control or analysis to inform decision makers about the state of operations and progress toward measured objectives or targets.  These loci become the raw data collection points, where the primary data elements and observations (and accompanying meta-data) are captured for downstream transformation and consumption.

For example, if a health system is trying to gain insight into opportunities for flexible scheduling of OR suites and surgical teams, the base-level data collection must probe into the start and stop times for each segment in the “setup and teardown” of a surgical case, and all the resource types and instances needed to support those processes.  Each individual process segment (e.g. OR ready/busy, patient in/out, anesthesia start/end, surgeon in/out, cut/close, PACU in/out) has distinct control loci, the measurement of which comprises the foundational data on which such analyses must be built.  You won’t gain visibility into optimization opportunities if you don’t measure the primary processes at sufficient granularity to facilitate inquiry and action.
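A minimal sketch of what capturing those pulse points might look like, using hypothetical event names and times for a single case: each (start, end) pair of timestamps yields one measurable process segment.

```python
from datetime import datetime

# Hypothetical pulse-point timestamps for one surgical case (event -> time).
case_events = {
    "patient_in":       "2024-01-15 07:02",
    "anesthesia_start": "2024-01-15 07:15",
    "cut":              "2024-01-15 07:41",
    "close":            "2024-01-15 09:10",
    "patient_out":      "2024-01-15 09:32",
}

def minutes_between(events, start, end):
    """Duration in minutes between two captured pulse points."""
    fmt = "%Y-%m-%d %H:%M"
    t0 = datetime.strptime(events[start], fmt)
    t1 = datetime.strptime(events[end], fmt)
    return (t1 - t0).total_seconds() / 60

# Each (start, end) pair is one measurable process segment.
segments = {
    "pre_incision": minutes_between(case_events, "patient_in", "cut"),
    "surgical":     minutes_between(case_events, "cut", "close"),
    "emergence":    minutes_between(case_events, "close", "patient_out"),
}
```

Collected at this granularity across every case, these segment durations become the raw material for the scheduling and throughput analyses described above.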

Each pulse point reveals a critical success component in the overall operation.  Management must decide how each process will be measured, and how the specific data to be captured will enable both visibility and action: visibility that the specific critical process elements being performed are within tolerance and on target, or that they are deviating from a standard or plan and require corrective action.  And the information must both enable and facilitate focused action that will bring performance and outcomes back into compliance with the desired or required standards or objectives.

A key aspect of metric design is defining the needed granularity and dimensionality.  The former ensures the proper focus and resolution on the action needed.  The latter facilitates traceability and exploration into the contexts in which performance and quality issues arise.  If any measured areas under-perform, the granularity and dimensionality will provide a focus for appropriate corrective actions.  If they achieve superior performance, they can be studied and characterized for possible designation as best practices.

For example, how does a surgical services line that does 2500 total knees penetrate this monolithic volume and differentiate these cases in a way that enables usable insights and focused action?  The short answer is to characterize each instance to enable flexible-but-usable segmentation (and sub-segmentation); and when a segment of interest is identified (under-performing; over-performing; or some other pattern), the n-tuple of categorical attributes that was used to establish the segment becomes a roadmap defining the context and setting for the action: either corrective action (i.e. for deviation from standard) or reinforcing action (i.e. for characterizing best practices).  So, dimensions of surgical team, facility, care setting, procedure, implant type and model, supplier, starting ordinal position, day of week, and many others can be part of your surgical analytics metrics design.
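The n-tuple segmentation idea can be sketched directly: group cases by a chosen tuple of dimensional attributes, then flag any segment whose metric deviates from the overall figure.  The case records, dimensions and 10% threshold below are invented for illustration.

```python
from statistics import mean

# Hypothetical case records: dimensional attributes plus one outcome metric.
cases = [
    {"surgeon": "A", "implant": "X", "day": "Mon", "cost": 11200},
    {"surgeon": "A", "implant": "X", "day": "Tue", "cost": 11050},
    {"surgeon": "B", "implant": "Y", "day": "Mon", "cost": 14800},
    {"surgeon": "B", "implant": "Y", "day": "Fri", "cost": 14650},
]

def segment(cases, dims):
    """Group cases by an n-tuple of categorical attributes; average the metric."""
    groups = {}
    for c in cases:
        key = tuple(c[d] for d in dims)
        groups.setdefault(key, []).append(c["cost"])
    return {k: mean(v) for k, v in groups.items()}

by_surgeon_implant = segment(cases, ("surgeon", "implant"))
overall = mean(c["cost"] for c in cases)
# Segments deviating >10% from overall become targets for corrective or
# reinforcing action; the key tuple itself is the roadmap to the setting.
flagged = {k: v for k, v in by_surgeon_implant.items()
           if abs(v - overall) / overall > 0.10}
```

The same `segment` call with a different dimension tuple (day of week, starting ordinal position, supplier, and so on) yields the sub-segmentations described above.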

Each metric must ultimately be deconstructed into the specific raw data elements, observations and quantities (and units) that are needed to support the computation of the corresponding metric.  This includes the definition, granularity and dimensionality of each data element; its point of origin in the operation and its position within the process to be measured; the required frequency for its capture and timeliness for its delivery; and the constraints on acceptable values or other quality standards to ensure that the data will reflect accurately the state of the operation or process, and will enable (and ideally facilitate) a focused response once its meaning is understood.

An interesting consideration is how to choose the source for a collected data element when multiple legitimate sources exist (this issue spills over into data governance; see below), and what rules are needed to arbitrate such conflicts.  Arbitration can be based on: whether each source is legitimately designated as authoritative; where each conflicting (or overlapping) data element (and its contents) resides in a life cycle that impacts its usability; what access controls or proprietary rights pertain to the specific instance of data consumption; and the purpose for or context in which the data element is obtained.  Resolving these conflicts is not always as simple as designating a single authoritative source.
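One simple way to encode such arbitration rules is a precedence function.  The sketch below, with hypothetical sources and life-cycle stages, ranks candidate values first by authoritative designation, then by life-cycle maturity.

```python
# Hypothetical life-cycle stages, ranked from least to most usable.
LIFECYCLE_RANK = {"preliminary": 1, "verified": 2, "final": 3}

def arbitrate(candidates):
    """Pick one value for a data element from competing sources.
    candidates: dicts with 'source', 'authoritative', 'lifecycle', 'value'.
    Precedence: authoritative designation first, then life-cycle stage."""
    def rank(c):
        return (c["authoritative"], LIFECYCLE_RANK.get(c["lifecycle"], 0))
    return max(candidates, key=rank)

winner = arbitrate([
    {"source": "lab_feed",    "authoritative": True,  "lifecycle": "preliminary", "value": 7.2},
    {"source": "ehr",         "authoritative": True,  "lifecycle": "final",       "value": 7.4},
    {"source": "spreadsheet", "authoritative": False, "lifecycle": "final",       "value": 7.0},
])
```

A real arbitration policy would extend the rank tuple with access-control and purpose-of-use factors, per the list above, rather than hard-coding two criteria.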

Controlling data quality at its source is essential.  All downstream consumers and transformation operations are critically dependent on the quality of each data element at its point of origin or introduction into the data stream.  Data cleansing becomes much more problematic if it occurs downstream of the authoritative source, during subsequent data transformation or data presentation operations.  Doing so effectively allows data to “originate” at virtually any position in the data stream, making traceability and quality tracking more difficult, and increasing the burden of holding the data that originates at these various points to the quality standard.  On the other hand, downstream consumers may have little or no influence or authority to impose data cleansing or capture constraints on those who actually collect the data.

Organizations are often unreceptive to the suggestion that their data may have quality issues.  “The data’s good.  It has to be; we run the business on it!”  Although this might be true, when you remove data from its primary operating context, and attempt to use it for different purposes such as aggregation, segmentation, forecasting and integrated analytics, problems with data quality rise to the surface and become visible.

Elements of data quality include accuracy; integrity; timeliness; timing and dynamics; clear semantics; and rules for capture, transformation and distribution.  Your strategy must include establishing, and then enforcing, definitions, measures, policies and procedures to ensure that your data meets the necessary quality standards.
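These quality elements can be made enforceable as explicit capture rules.  The sketch below, with invented field names and constraints, returns the fields of a record that fail their rules; run at the point of origin, it keeps cleansing upstream where it belongs.

```python
import re

# Hypothetical quality rules for a discharge record.
RULES = {
    "mrn":            lambda v: isinstance(v, str) and bool(re.fullmatch(r"\d{6,8}", v)),        # integrity: well-formed identifier
    "discharge_date": lambda v: isinstance(v, str) and bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v)),  # clear semantics: one date format
    "los_days":       lambda v: isinstance(v, (int, float)) and 0 <= v <= 365,                   # accuracy: plausible range
}

def check_quality(record):
    """Return the list of fields that are missing or fail their capture rules."""
    return [field for field, rule in RULES.items()
            if field not in record or not rule(record[field])]

failures = check_quality({"mrn": "1234567",
                          "discharge_date": "2024-02-30x",
                          "los_days": 4})
```

The same rule table can double as documentation of the quality standard, so that what is enforced and what is defined never drift apart.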

The data architecture must anticipate the structure and relationships of the primary data elements, including the required granularity, dimensionality, and alignment with other identifying or describing elements (e.g. master and reference data); and the nature and positioning of the transformation and consumption patterns within the various user bases.

For example, to analyze the range of variation in schedule integrity in our surgical services example, for each case we must capture micro-architectural elements such as the scheduled and actual start and end times for each critical participant and resource type (e.g. surgeon, anesthesiologist, patient, technician, facility, room, schedule block, equipment, supplies, medications, prior and following case), each of which becomes a dimension in the hierarchical analytic contexts that will reveal and help to characterize where under-performance or over-performance is occurring.  The corresponding macro-architectural components will address requirements such as scalability, the distinction between retrieval and occurrence latency, data volumes, data lineage, and data delivery.

By the way: none of this presumes a “daily batch” system.  Your data architecture might need to anticipate and accommodate complex hybrid models for federating and staging incremental data sets to resolve unavoidable differences in arrival dynamics, granularity, dimensionality, key alignment, or perishability.  I’ll have another blog on this topic, separately.

You should definitely anticipate that the incorporation and integration of additional subject areas and data sets will increase the value of the data; in many instances, far beyond that for which it was originally collected.  As the awareness and use of this resource begins to grow, both the value and sensitivity attributed to these data will increase commensurately.  The primary purpose of data governance is to ensure that the highest quality data assets obtained from all relevant sources are available to all consumers who need them, after all the necessary controls have been put in place.

Key components of an effective strategy are: the recognition of data as an enterprise asset; the designation of authoritative sources; commitment to data quality standards and processes; and recognition that data proceeds through a life cycle of origination, transformation and distribution, with varying degrees of ownership, stewardship and guardianship, on its way to various consumers for various purposes.  Specific characteristics, such as the level of aggregation, the degree of protection required (e.g. PHI), the need for de-identification and re-identification, the designation of “snapshots” and “versions” of data sets, and the constraints imposed by proprietary rights, will all impact the policies and governance structures needed to ensure proper usage of this critical asset.

Are you positioned for success?

Successful implementation of BI analytics requires more than a careful selection of technology platforms, tools and applications.  The selection of technical components should ideally follow the definition of the organization’s needs for these capabilities.  The program components outlined here are a good start on the journey to embedded analytics, proactively driving the desired improvements throughout your enterprise.