Are you “ACO IT-Ready”?

First things first: I believe the push for accountable care is here to stay. I do not think it is a fad that will come and go as many other attempts at healthcare reform have. Having said that, I also strongly believe that very few organizations are positioned to start realizing the benefits of this reform any time soon. It’s not for lack of trying, as many organizations are already recognized as Pioneer ACOs. But the hard part is not being established as an ACO – it’s proving you’re reducing costs and improving quality for targeted patient populations.

The first step begins January 1st, 2013, when some ACOs will be required to start reporting quality measures – for instance, the CMS Shared Savings Program requires reporting 33 quality measures for both the one-sided and two-sided models. Notice I said “reporting”. So for the first year, it’s “pay for reporting”. Years 2 and 3 are when the rubber really meets the road and it becomes “pay for performance”: “Don’t just show me you are trying to reduce costs and improve quality; actually reduce and improve, or realize the consequences.”

With ACOs come reporting requirements. We in healthcare are used to reporting requirements. And for those of us willing to publicly acknowledge it, more reporting means more waste. Why? Because there is job security in paying people to run around and find data…and to eventually do very little with it other than plug it into a spreadsheet, post it to a SharePoint site, email it to someone else, or, well, you get my drift. Regardless of your view on these new requirements, they’re here to stay. So the $64,000 question is: are you ready to start reporting?

There is a wide range of both functional and technical requirements that healthcare providers and payers will need to address as they start operating as an ACO. Many of the early and emerging ACOs have started the journey from a baseline of targeted patient panels to the optimized management of a population, progressing through a model with some or all of the capabilities reflected in the questions below.

These are 7 simple questions you must be able to answer and report on DAY 1:

  1. Can you define and identify your targeted patient populations?
  2. Are you able to measure the financial and quality performance and risks of these patient panels and populations?
    1. Can you quickly, easily and consistently report quality and financial measures by Physician, Location, Service, or Diagnosis?
  3. Can you baseline your expenditures and costs associated with various targeted patient populations?
    1. How will you benchmark your “before ACO” and “after ACO” costs?
  4. Can you accurately monitor the participation, performance and accountability of the ACO participants involved in coordinated, collaborative patient care?
  5. Will you be able to pinpoint where and when the quality of care begins to drift, so as to quickly intervene with care redesign improvements to limit the impacts on patients and non-reimbursable costs?
    1. Are you able to detect “patient leakage” and provide your organization the information for its management? (Patient leakage is when a patient that you are treating as an ACO under a bundled payment leaves the network for their care; see the sketch after this list.)
      1. Is a particular provider/provider group sending patients outside of the ACO?  If so, is it for a justified reason?
      2. Does the hospital need to address a capacity issue?
  6. Can you reconcile your internal costs of care with bundled reimbursements from payers?
  7. Are you positioned for population health management and achieving the Triple Aim on a continuing basis?
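To make the leakage question concrete, here is a minimal sketch of how a flat claims extract could surface which providers send attributed patients out of network. The file name and the columns (patient_id, referring_provider, in_network, allowed_amount) are hypothetical placeholders for whatever your claims feed actually provides:

```python
import pandas as pd

# Hypothetical extract: one row per claim for attributed ACO patients.
claims = pd.read_csv("attributed_claims.csv")

# Keep only out-of-network ("leaked") claims.
leaked = claims[~claims["in_network"]]

# Which referring providers send the most care outside the ACO, and at what cost?
leakage_by_provider = (
    leaked.groupby("referring_provider")
    .agg(leaked_claims=("patient_id", "count"),
         leaked_dollars=("allowed_amount", "sum"))
    .sort_values("leaked_dollars", ascending=False)
)
print(leakage_by_provider.head(10))
```

A report like this is the starting point for the follow-up questions above: a provider at the top of the list may have a justified clinical reason for referring out, or may be signaling a capacity issue the hospital needs to address.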

In order to answer these questions, you must have a highly integrated data infrastructure. It seems I’m not the only one who agrees with this tactical first step:

  • The Cleveland Clinic Journal of Medicine agrees: it lists “Technical and informatics support to manage individual and population data” as one of its 5 Core Competencies Required to be an ACO.
  • Presbyterian Healthcare Services (PHS) has been a Pioneer ACO for over a year. Tracy Brewer, the lead project manager, was recently asked by Becker’s Hospital Review, “What goals did you set as an ACO in the beginning of the year and how have you worked to achieve them?” Her answer: “One of the major ones [goals] was updating our administrative and IT infrastructure. We had to make sure we had all the operational pieces in place to function as [an] ACO. We also completed some work on our IT infrastructure so that once we received the claims data from CMS, we could begin analysis and really get value from it.”

The ACO quality measures require data from a number of different data sources. Be honest with me and yourselves: how confident are you that your organization is ready? Is your data integrated? Do you have consistent definitions for Provider, Patient, Diagnosis, Procedure, and Service? If you do, great; you don’t have much company. If you don’t, rest assured there are organizations that have been doing data integration for nearly two decades that can help you answer the questions above, as well as many more related to this new thing they call Accountable Care.

How to Optimize Microsoft Dynamics CRM 2011 Applications (Part 2)

In this blog, I will highlight changes to improve performance for Microsoft Dynamics CRM Customizations. Additional blogs will follow highlighting performance improvements to custom Microsoft Dynamics CRM SDK applications and Microsoft Dynamics CRM Reporting Services.

Microsoft Dynamics CRM Customizations Optimization

The following are guidelines to keep in mind when optimizing the performance of Microsoft Dynamics CRM customizations:

  • Carefully consider the possible effects on your organization’s business before removing or changing the order of records returned by a saved query. There may be important business reasons associated with the order of display in query results. If there is a business reason to change the order, consider adding an index based on the new ordering to improve the performance of the query.
  • Use an iterative process to determine which index best optimizes query performance. Test each index using a variety of selection criteria that may be common for the specific query: while one set of criteria may deliver the projected performance increase from an index, different criteria may show no effect. A simple timing harness, sketched below, makes this comparison repeatable.
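Here is a minimal sketch of such a harness in Python, assuming direct read access to the organization database via pyodbc. The server and database names and the custom columns (new_region, new_accounttier) are hypothetical, and any index work against a CRM database should follow Microsoft’s supported guidance:

```python
import time

import pyodbc

# Hypothetical connection; substitute your CRM organization database.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=crmsql;DATABASE=Contoso_MSCRM;Trusted_Connection=yes"
)
cursor = conn.cursor()

# Selection criteria users commonly apply to the saved query.
test_filters = [
    ("statecode = 0 AND new_region = ?", ("East",)),
    ("statecode = 0 AND new_accounttier = ?", (1,)),
]

# Time each filter before and after adding a candidate index.
for where_clause, params in test_filters:
    sql = "SELECT TOP 100 name FROM AccountBase WHERE %s ORDER BY name" % where_clause
    start = time.perf_counter()
    cursor.execute(sql, params)
    cursor.fetchall()
    print("%s: %.3fs" % (where_clause, time.perf_counter() - start))
```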

Consider these potential optimization techniques:

Disabling Auto-Complete on Lookups:

The Auto-Complete on Lookups functionality is enabled by default, and while it can help to increase user efficiency, it can also affect overall performance and resource usage in a deployment of Microsoft Dynamics CRM 2011. For optimal performance, consider disabling this functionality:

  • STEP 1: Log into Microsoft Dynamics CRM 2011 with a user that has a security role allowed to customize entities.
  • STEP 2: Proceed to Settings > Customize the System > Components > Entities > Entity > Forms. In this case, Entity signifies the name of the entity to be modified, and Forms means selecting the form that includes the field that needs to be modified.
  • STEP 3: Once in ‘Form Design’ view, find the field for which Auto-Complete is configured and then select ‘Change Properties’.
  • STEP 4: Once the ‘Field Properties’ window has appeared, proceed to the ‘Display’ tab and under ‘Field Behavior’ mark the check box to ‘Turn off Automatic Resolutions’ and select ‘OK’.
  • STEP 5: Repeat this process on each Entity/Field combination where the auto-complete feature needs to be disabled.

Querying on Custom Entities:

When custom attributes are added to Microsoft Dynamics CRM system entities or to custom entities, the columns for those attributes are stored in an extension table in the SQL Server database rather than in the entity’s base table.

When optimizing the performance of queries on custom entities, confirm that all columns in the ‘Order By’ clause derive from a single table, and create an index that satisfies the ‘Order By’ requirements and as much of the query’s ‘WHERE’ clause selection criteria as possible. The process of creating the appropriate index will be iterative; however, the performance benefits can be significant if implemented correctly.
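For instance, here is a hedged sketch of creating such a covering index, assuming a hypothetical custom entity new_project whose custom columns live in an extension table named new_projectExtensionBase per the CRM naming convention; all column names are placeholders:

```python
import pyodbc

# Hypothetical connection; substitute your CRM organization database.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=crmsql;DATABASE=Contoso_MSCRM;Trusted_Connection=yes"
)
cursor = conn.cursor()

# The ORDER BY columns and as much of the WHERE criteria as possible all
# come from the single extension table, so one index can satisfy the query.
cursor.execute("""
    CREATE NONCLUSTERED INDEX IX_new_projectExt_status_duedate
    ON new_projectExtensionBase (new_status, new_duedate)
    INCLUDE (new_budget)
""")
conn.commit()
```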

To learn more about Edgewater’s CRM practice, click here.

Image courtesy of xkcd.com

How to Optimize Microsoft Dynamics CRM 2011 Applications (Part 1)

When considering optimal performance for Microsoft Dynamics CRM 2011, the following areas require attention:

  1. Microsoft Dynamics CRM Web Application
  2. Microsoft Dynamics CRM Customizations
  3. Custom Microsoft Dynamics CRM SDK Applications
  4. Microsoft Dynamics Reporting Services

In this blog, I will highlight changes to improve performance for the Microsoft Dynamics CRM Web Application.  Additional blogs will follow highlighting areas two, three and four.

Microsoft Dynamics CRM Web Application Optimization

The following are some simple changes to the out-of-the-box configuration of Microsoft Dynamics CRM:

Setting the Default Views:

When starting Microsoft Dynamics CRM, viewing all records for an entity can be resource intensive, particularly as the size of the database increases. In order to improve system performance, the default views should be customized to limit the records that are displayed. For example, to see the default view for Contacts (assuming changes are made to the default solution):

  • STEP 1: On the Microsoft Dynamics CRM home page,
    • Select ‘Settings’;
    • Then under ‘Customization’, click ‘Customizations’;
    • Select ‘Customize the System’;
  • STEP 2: In the Solution window, under Components,
    • Select ‘Entities’;
    • Then select ‘Contact’;
    • Finally select Views;

NOTE: In the list of available views, the entry for the current default view is marked with a star. It displays as Default Public View.

  • STEP 3: Select the view that you want to set as the default, then on the action toolbar,
    • Select ‘More Actions’;
    • Then select ‘Set Default’;
  • STEP 4: To save and publish the configuration changes, select ‘Publish All Customizations’ or under the Contact entity select ‘Publish’.

Quick Find Views (Limiting Search Columns):

The number of fields searched to display quick find results can directly affect performance. For optimal performance, the quick find feature for each entity should only be configured to search the fields that address specific business requirements. The following steps explain how to customize the ‘Quick Find’ feature (assuming changes are made to the default solution):

  • STEP 1: On the Microsoft Dynamics CRM home page,
    • Select ‘Settings’;
    • Then under ‘Customization’, click ‘Customizations’;
    • Select ‘Customize the System’;
  • STEP 2: In the Solution window, under Components,
    • Select ‘Entities’;
    • Then select the entity whose ‘Quick Find’ view needs to be customized;
    • Finally select Views;
  • STEP 3: In the list of views
    • Select the ‘Quick Find’ view;
    • Then on the action toolbar select ‘More Actions’;
    • Then select ‘Edit’;
  • STEP 4: In the ‘Quick Find’ view
    • Select ‘Add Find Columns’ located under ‘Common Tasks’;
  • STEP 5: Select the fields that will be searched to provide Quick Find results and then select ‘OK’.
  • STEP 6: Select ‘Save and Close’ in the ‘Quick Find’ window and then to save and publish the configuration changes, select ‘Publish All Customizations’ or under the selected entity select ‘Publish’.

If quick find results are still slow to return after the simple changes described above, then database optimization techniques and tools should be leveraged, with consideration given to the following aspects:

  • The more columns included in a search, the longer the search will take.
  • Fields included in a search should be leading columns in indexes, even if this means creating one index per field. Also consider including in those indexes the Owner, Business Unit, and State fields, which are typically included in the query.
  • The use of filtered indexes can result in better query performance (see the sketch after this list).
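As one illustration of the last two points, here is a hedged sketch of a filtered index supporting a Quick Find on the Contact email address. ContactBase, EMailAddress1, OwnerId, OwningBusinessUnit, and StateCode follow the CRM 2011 database schema, but treat the exact column choices as an assumption to validate against your own query plans:

```python
import pyodbc

# Hypothetical connection; substitute your CRM organization database.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=crmsql;DATABASE=Contoso_MSCRM;Trusted_Connection=yes"
)
cursor = conn.cursor()

# Quick Find only searches active records, so filtering on StateCode = 0
# keeps the index small while still covering the common query.
cursor.execute("""
    CREATE NONCLUSTERED INDEX IX_ContactBase_QuickFind_Email
    ON ContactBase (EMailAddress1, OwnerId, OwningBusinessUnit)
    WHERE StateCode = 0
""")
conn.commit()
```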

Best Practices:

The following are best practices to keep in mind when considering optimal performance of CRM 2011:

  • Team Functionality: Teams can own records (objects), and records can be shared with teams, thereby allowing members of a team access to the shared records. Consider leveraging the enhanced teams functionality instead of a complex business hierarchy, because teams provide better performance with less overhead from security checks.
  • Field Level Security (FLS) Functionality: Field Level Security provides more granular control over the data that specified users can or cannot create, update, or view. While FLS is a great feature, the more comprehensively it is used in an implementation, the greater the impact on performance.

To learn more about Edgewater’s Microsoft Dynamics expertise visit www.fullscope.com

What I Learned at Health Connect Partners Surgery Conference 2012: Most Hospitals Still Can’t Tell What Surgeries Turn a Profit


As I strolled around the Hyatt Regency at the Arch in downtown St. Louis amongst many of my colleagues in surgery and hospital administration, I realized I was experiencing déjà vu. Not the kind where you know you’ve been somewhere before. The kind where you know you’ve said the same thing before. Except it wasn’t déjà vu. I really was having many of the same conversations I had a year ago at the same conference, except this time there was a bit more urgency in the voices of the attendees. It’s discouraging to hear that most large hospitals STILL can’t tell you which surgeries make or lose money! Which surgeons have high utilization linked to high quality? What the impact of SSIs is on ALOS? Why there are eight orthopedic surgeons, nine different implant vendors, and 10 different total hip implant options on the shelves? It’s encouraging, though, to hear people FINALLY admit that their current information systems DO NOT provide the integrated data they need to analyze these problems and address them with consistency, confidence, and in real time.

Let’s start with the discouraging part. When asked if their current reporting and analytic needs were being met, I got a lot of the same uninformed, disconnected responses: “yeah, we have a decision support department”; “yeah, we have Epic so we’re using Clarity”; “oh, we just <insert limited, niche data reporting tool here>”. I don’t get too upset, because I understand that in the world of surgery, there are very few organizations that have truly integrated data. Therefore, they don’t know what they don’t know. They’ve never seen materials, reimbursement, billing, staffing, quality, and operational data all in one place. They’ve never been given consistent answers to their data questions. Let’s be honest, though: the priorities are utilization, turnover, and volume. Very little time is left to consider the opportunities to drastically lower costs, improve quality, and increase growth by integrating data. It’s just not in their vernacular. I’m confident, though, that these same people are now, more than ever, being tasked with finding ways to lower costs and improve quality – not just because of healthcare reform, but because of tightening budgets, stringent payers, stressed staff, and more demanding patients. Sooner or later they’ll start asking for the data needed to make these decisions – and when they don’t get the answers they want, the light will quickly flip on.

Now for the encouraging part – some people have already started asking for the data. These folks can finally admit they don’t have the information systems needed to bring operational, financial, clinical and quality data together. They have siloed systems – they know it, I know it, and they’re starting to learn that there isn’t some panacea off-the-shelf product that they can buy that will give this to them. They know that they spend way too much time and money on people who simply run around collecting data and doing very little in the way of analyzing or acting on it.

So – what now?! For most of the attendees, it’s back to the same ol’ manual reporting, paper chasing, data crunching, spreadsheet hell. Stale data, static reports, yawn, boring, seen this movie a thousand times. Others are just starting to crack the door open on the possibility of getting help with their disconnected data. And a very few are out ahead of everyone else because they are already building integrated data solutions that provide significant ROIs. For these folks, gone are the days of asking for static, snapshot-in-time reports – they have a self-service approach to data consumption in real time and are “data driven” in all facets of their organization. These are the providers that have everyone from the CEO down screaming, “SHOW ME THE DATA!”, and they are the ones I want to partner with in the journey to lower cost, higher quality healthcare. I just hope the others find a way to catch up, and soon!

Managing Data Integrity

When was the last time you looked at a view of data, report, or graph in CRM and said to yourself, “This doesn’t look right”? You’re not alone. Keeping data up-to-date is a common issue for many organizations. We rely on its accuracy for decision making. An example of decision-making from data is determining which resource to assign to a project. If the project pipeline is inaccurate, a more senior resource might get tied up in a smaller project when their skillset would have been better used on a more important project. Another example might be deciding to make an investment based on erroneous forecasts of that investment’s future.

When data is out-of-date and you recognize this, the risk of an inaccurate decision is diminished as you have the opportunity to contact the data owner(s) to get an update. When it goes unnoticed, the risk of bad decisions increases. While there are many reasons why data can get out of date, there is often one common root cause: the person responsible for entering the data did so incorrectly or failed to do so. Rather than demonizing a person, we can look to find ways to make it easier for the data to be kept up to date.

There are many factors that go into data integrity:

Does the responsible party for the data entry also own the information gathering mechanism?

This can manifest when there is a team assigned to a record or there is a disconnect and/or lag in the data gathering process. For example, if there is a government agency that only provides updates periodically, but management needs information more frequently, this can present a problem. Possible solutions:

  • One record – one owner. No team ownership of a record.
  • Talk with management about the data they want and the source if outside the direct control of the responsible party. Have an open dialogue if the data gathering mechanism is flawed or doesn’t meet the needs of management to decide on a best course of action.

Does data have to be kept up-to-date in real time, or can it be done periodically?

Not all decisions have to be made ad hoc. Some can be deferred, occurring weekly or monthly. It is important that an organization examine the risk associated with each data element. Elements that feed high-risk areas, or decisions that need to be made more often, need frequent updates from their data owners. Those with less risk, or that are used less often, can have less emphasis on being kept up to date. Remember, at the end of the day, a person, somewhere, had to provide that data. No one is perfect, and it is unreasonable to expect perfection on every record, every field, every time. Prioritize!

Can data be automated?

There are many tools available that can be added on to your software to automate data gathering. Many companies have created tools that, for example, go out to the web and pull in data updates related to a search topic. Consider installing or developing such tools where appropriate; this reduces the need to assign a person in your organization to the task. It will save time and money!
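As a minimal sketch of the idea, the snippet below polls a web API for updates on a topic; the endpoint, parameters, and response shape are all hypothetical stand-ins for whatever source your tool actually exposes:

```python
import requests

# Hypothetical endpoint returning updates as JSON; substitute your source.
SEARCH_URL = "https://api.example.com/company-updates"

def fetch_updates(topic):
    """Pull the latest records for a search topic instead of re-keying them."""
    response = requests.get(SEARCH_URL, params={"q": topic}, timeout=10)
    response.raise_for_status()
    return response.json()["results"]

for update in fetch_updates("Acme Corporation"):
    print(update["field"], "->", update["new_value"])
```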

Consider using a tool’s workflow or a manually created workflow to help remind data owners to make updates.

Many data tools have built-in workflows. These can be used to set tasks or send a periodic email reminding data owners to update a record. An example might be to create a field called “Last update”, which should be changed each time a person reviews the record and updates its important fields. If this date is more than a week old, an email can be sent to the data owner. Where such workflows are not available in the tool, one could use an email application’s recurring task or calendar item as a reminder. As a last resort, a sticky note on a physical calendar can do the trick!
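Here is a hedged sketch of that reminder workflow in code, assuming records exposed with a “Last update” timestamp and an owner email; the mail server, addresses, and record shape are hypothetical:

```python
import smtplib
from datetime import datetime, timedelta
from email.message import EmailMessage

STALE_AFTER = timedelta(days=7)

# Hypothetical records pulled from your CRM or data tool.
records = [
    {"name": "Acme opportunity",
     "last_update": datetime(2012, 10, 1),
     "owner": "owner@example.com"},
]

def remind_owner(record):
    """Nudge the data owner when the 'Last update' field goes stale."""
    msg = EmailMessage()
    msg["Subject"] = "Please review: %s is stale" % record["name"]
    msg["From"] = "crm-reminders@example.com"
    msg["To"] = record["owner"]
    msg.set_content("The 'Last update' field is more than a week old; "
                    "please refresh the record.")
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

for rec in records:
    if datetime.now() - rec["last_update"] > STALE_AFTER:
        remind_owner(rec)
```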

Data is the life-blood of an organization. Keeping it up-to-date is important for decision making affecting both small and big outcomes. Most data comes from people. Help your people by setting up reasonable, sound business practices and processes around data integrity. This won’t prevent erroneous data, but you’ll find less of it, and it will make your work-life, and your data owners’ work-lives, much easier. For a case study about how Edgewater has followed these practices, click here for more information.

Epic Clarity Is Not a Data Warehouse

It’s not even the reporting tool for which your clinicians have been asking!

I have attended between four and eight patient safety and quality healthcare conferences a year for the past five years. Personally, I enjoy the opportunities to learn from what others are doing in the space. My expertise lies at the intersection of quality and technology; therefore, it’s what I’m eager to discuss at these events. I am most interested in understanding how health systems are addressing the burgeoning financial burden of reporting more (both internal and external compliance and regulatory mandates) with less (from tightening budgets and, quite honestly, allocating resources to the wrong places for the wrong reasons).

Let me be frank: there is job security in health care analysts, “report writers,” and decision support staff. They continue to plug away at reports, churn out dated spreadsheets, and present static, stale data without context or much value to the decision makers they serve. In my opinion, patient safety and quality departments are the worst culprits of this waste and inefficiency.

When I walk around these conferences and ask people, “How are you reporting your quality measures across the litany of applications, vendors, and care settings at your institution?”, you want to know the most frequent answer I get? “Oh, we have Epic (Clarity)”, “Oh, we have McKesson (HBI)”, or “Oh, we have a decision support staff that does that”. I literally have to hold back a combination of emotions – amusement (because all I can do is laugh) and frustration (because I’m so frustrated). I’ll poke holes in just one example: if you have Epic and use Clarity to report, here is what you have to look forward to, straight from the mouth of a former Epic technical consultant:

“It is impossible to use Epic ‘out of the box’ because the tables in Clarity must be joined together to present meaningful data. That may mean (probably will mean) a significant runtime burden because of the processing required. Unless you defer this burden to an overnight process (ETL), the end users will experience significant wait times as their report proceeds to execute these joins. Further, they will wait every time the report runs. Bear in mind that this applies to all of the reports that Epic provides. All of them are based directly on Clarity. Clarity is not a data warehouse. It is merely a relational version of the Chronicles data structures, and as such, is tied closely to the Chronicles architecture rather than a reporting structure. Report customers require de-normalized data marts for simplicity, and you need star schema behind them for performance and code re-use.”

You can’t pretend something is what it isn’t.

Translation that healthcare people will understand: Clarity only reports data in Epic. Clarity is not the best solution for providing users with fast query and report responses. There are better solutions (data marts) that provide faster reporting and allow for integration across systems. Patient safety and quality people know that you need to get data out of more than just your EMR to report quality measures. So why do so many of you think an EMR reporting tool is your answer?
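To make the distinction concrete, here is a toy, self-contained sketch of the overnight data-mart pattern the consultant describes: pay the join cost once in an ETL and land a flat, report-friendly table. SQLite stands in for your warehouse platform, and every table and column name is illustrative, not actual Clarity schema:

```python
import sqlite3  # in-memory stand-in for your warehouse platform

conn = sqlite3.connect(":memory:")

conn.executescript("""
-- Toy versions of normalized source tables (the real ones would be
-- extracted from Clarity overnight).
CREATE TABLE patients   (patient_id INTEGER, mrn TEXT, patient_name TEXT);
CREATE TABLE encounters (encounter_id INTEGER, patient_id INTEGER, department TEXT);
CREATE TABLE measures   (encounter_id INTEGER, measure_code TEXT, numerator_met INTEGER);

INSERT INTO patients   VALUES (1, 'MRN001', 'Jane Doe');
INSERT INTO encounters VALUES (100, 1, 'Surgery');
INSERT INTO measures   VALUES (100, 'SCIP-1', 1);

-- Pay the join cost once, overnight, and land a flat reporting table
-- so report users never wait on these joins at runtime.
CREATE TABLE fact_encounter_quality AS
SELECT e.encounter_id, p.mrn, p.patient_name, e.department,
       m.measure_code, m.numerator_met
FROM encounters e
JOIN patients   p ON p.patient_id   = e.patient_id
JOIN measures   m ON m.encounter_id = e.encounter_id;
""")

for row in conn.execute("SELECT * FROM fact_encounter_quality"):
    print(row)
```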

There is a growing sense of urgency at the highest levels in large health systems to start holding quality departments accountable for the operational dollars they continue to waste on non-value added data crunching, report creation, and spreadsheets. Don’t believe me? Ask yourself, “Does my quality team spend more time collecting data and creating reports/spreadsheets or interacting with the organization to improve quality and, consequently, the data?”

Be honest with yourself. At best, the ratio is 70% of an FTE on collection and 30% on analysis and action. So – get your people out of the basement, out from behind their computer screens, and put them to work. And by work, I mean acting on data and improving quality, not just reporting it.

Personnel, personnel, everywhere, nor any data to drink.

IT’S UNFORTUNATE: Large amounts of money are spent on new hires, yet little is left for employee and data improvement

I recently had an Executive Director of a Cancer Institute tell me,

“At this time, we plan to use simple spreadsheets for our database. We are committing more than $500,000 for investment in personnel to start our translational laboratory this year. I hope we can subsist with simple spreadsheet use for our pilot studies.”

This sentiment immediately followed a detailed discussion, one that I’m very familiar with, concerning disparate researchers’ databases and how organizations’ needs remain unsatisfied, suffering from lack of integrated data.

Just so we’re all on the same page, let me make sure I understand this situation correctly –

  1. You are currently using “simple spreadsheets” to assist researchers with all things data. You’ve astutely noticed that these stale methods don’t meet your needs, and you agreed to a meeting with Edgewater because you’ve heard positive success stories from other cancer centers.
  2. You just committed more than half a million dollars to fresh staff for a new translational lab.
  3. You are now budget-constrained because of this arrangement and want these new hires to use “simple spreadsheets” to do their new jobs… the same ineffective and inefficient spreadsheets, of course, that caused the initial trouble.

Did I understand all that correctly? I didn’t grow up in the ’60s, so I’ll continue to pass on what he’s smoking.

So who wins with this strategy, you ask? No one!

We keep buying things thinking ‘that’ll look better’ and it just doesn’t

It’s unfortunate for the researchers because they continue to rely on an antiquated approach for data collection and analysis that will continue to plague this organization for years to come.

How many opportunities will be overlooked because a researcher becomes overwhelmed by his data?

It’s unfortunate for the organization because it’s nearly impossible to scale volumes (data aggregation, analysis, more clinical trials, more federal/state grant submissions, etc.) with such a fragmented approach. How much IP will walk out of the door for these organizations on those simple spreadsheets?

It’s unfortunate for the brand because it can’t market or advertise any advances, operationally or clinically, that will attract new patients.

It’s unfortunate for the patients because medicine as an industry collectively suffers when:

  • Surgeons under the same roof don’t recognize perfect candidates for clinical trials they’re unaware of, and so never notify their counterpart researchers.
  • Executives continue to suffer budget declines from lower patient volumes, while additional revenue from industry partnerships goes to the cancer centers that have their act together.
  • Researchers under a single roof don’t know what each other are doing.

As in the picture above, “more” doesn’t necessarily mean “better.” Ancillary personnel and sheets of data don’t necessarily equate to a better outcome. Why continue to add more, knowing that this won’t solve the problem? Why infect more new hires with the same sick system? Why addition instead of introspection?

So, just as I told him in my response, I look forward to hearing from you in about 12-18 months; that’s roughly the amount of time it took the last dozen clients to call Edgewater back to save them from themselves.

Keeping the Black Swan at Bay

A recent article in the Harvard Business Review highlighted some alarming statistics on project failures. IT projects were overrunning their budgets by an average of 27%, but the real shocker was that one in six of these projects was over by 200% on average. They dubbed these epic failures the “black swans” of the project portfolio.

The article ends with some excellent advice on avoiding the black swan phenomenon, but the recommendations focus on two areas:

  • Assessments of the ability of the business to take a big hit
  • Sound project management practices such as breaking big projects down into smaller chunks, developing contingency plans, and embracing reference class forecasting.

We would like to add to this list a set of “big project readiness” tasks that can further prevent your next big IT project from becoming a black swan.

Project Management Readiness: If you don’t have seasoned PMs with successful big project experience on your team, you need to fill that staffing gap either permanently or with contract help for the big project. Yes, you need an internal PM even if the software vendor has their own PM.

Data Readiness:  Address your data quality issues now, and establish data ownership and data governance before you undertake the big project.

Process/organization/change management readiness: Are your current business processes well documented? Is the process scope of the big project defined correctly? Are process owners clearly identified?  Do you have the skills and framework for defining how the software may change your business processes, organization structure and headcounts? If not, you run a significant risk of failing to achieve anticipated ROI for this project. Do you have a robust corporate communication framework? Do you have the resources, skills and experience to develop and run training programs in house?

Let’s face it: experience matters. If you’re already struggling to recover from a technology black swan, you are at considerable risk for reproducing the same level of failure if you don’t undertake a radical overhaul of your approach by identifying and addressing every significant weakness in the areas noted above.

We have developed a project readiness assessment model that can help you understand your risks and develop an action plan for addressing them before you undertake anything as mission-critical as an ERP replacement, CRM implementation, legacy modernization, or other major technology project. If you have a big project on your radar (or already underway), contact makewaves@edgewater.com to schedule a pre-implementation readiness assessment.

Why are YOU going to the OR Management Conference this year?

This will be my second time at the annual OR Management conference. I enjoyed the conference last year, which highlighted subjects like operational efficiencies, new modalities of treatment, Lean methodologies, materials standardization, etc. I’m sure this year will offer similar educational opportunities, but that’s not why I’m going.
After working in an operating room, three years of attending Operating Room-related conferences, and consulting with clients in the healthcare industry, it has become clear to me that there is a gap between what OR Directors and Managers do on a daily basis and the expectations their administrators set for them. Let’s face it: OR Managers and Directors are typically hired for their clinical experience. The shame is that, despite their credentials, these people end up spending the majority of their time putting out fires, managing surgeon and anesthesiologist egos, and fighting political battles. Unfortunately, very little time is spent “actually managing the OR like a business.”
However, this laundry list of management disasters does not negate executives’ expectations of the new OR director, whom they often ask to:

  • Lower variation of implant and material choices across service lines
  • Improve first  case on-time starts
  • Reduce SSI rates
  • Increase block and overall room utilization
  • Drop turnover time from 44 to 23 minutes

All of these demands have something in common – they require integrated data from multiple systems in the OR to analyze and address. However, when I talk to managers about their worries about integrating this data to efficiently address the executive demands, they are reluctant to change. The most common justification is, “Well, we already have a Corporate IS department,” or even, “Well, we have [insert EMR vendor’s name here] tool for that.” This response makes me laugh (and cry) because it differentiates those who “get it” from those who don’t.
Every hospital is unique, and every Operating Room has its own set of priorities, systems, processes, and people; there is currently no off-the-shelf or black-box solution to help an OR Manager actually manage an OR. Yes, there is a module for Quality somewhere over here, and maybe an app for Labor & Productivity over there, but there can be no standard comprehensive, scalable, extensible solution that accommodates the variety of clinical, financial, operational, research, market share, physician credentialing, materials management, and other disparate data sets of each hospital.

However, despite the strength of such a solution, it is not a costly effort; the ROI is short-term and clear. There should be money in every budget to build these solutions, because they are built to address both immediate, short-term needs (such as better reporting for quality, or analysis for standardizing implants in total joints) and long-term needs (such as multi-facility standardization, automated external and internal reporting of patient safety and quality measures, and integrating health plan and other data for measuring true cost per case).
I’m going to the conference to see how many of the OR Managers embrace this approach, are eager to capitalize on the huge opportunities there are to save millions of dollars in the OR, and understand that Corporate IS departments don’t help the business users create solutions like this.


Are you ready to embrace the opportunities your institution has starting with integrating your data? Will you join me at the OR Managers conference? I’d like to hear your unique needs and how we can collaborate and address them together.

The Never-Ending Burden of Reporting Patient Safety & Quality Metrics

Quick: how long does it take you to collect, aggregate, and report your SCIP, PN, AMI, and HF Core Measures? How about infection control metrics like rates of CLABSI, VAP, UTI, and MRSA? Or, for that matter, any patient safety and quality metric that is mandated by JCAHO, CMS, your Department of Health, or anyone else? If you answered anything less than 2 months and I were a betting man, I’d bet you were lying.

There is a never-ending burden strapped to the backs of hospitals to collect, aggregate, analyze, validate, re-analyze, re-validate, report, re-validate, report again… quality measures. Reporting of these quality metrics is meant to benchmark institutions across the industry on their level of care and inform patients of their treatment options. Fortunately for the majority of institutions, it is not difficult to achieve a high rate of compliance (>80-90%) because clinicians genuinely want to provide the best standards of care. Unfortunately, though, the standard for achieving the highest designation according to CMS guidelines (top percentile, >99%) requires hospitals to allocate a disproportionate amount of time, money, and people to gain very small increments of compliance. I sat with a SCIP Nurse Abstractor last week, and we spent 90 minutes drawing out, on 2 consecutive white boards, the entire process of reporting SCIP core measures from start to finish. There are over 50 steps, 5 spreadsheets/files, 4 hand-offs, 3 committees, and a partridge in a pear tree. It takes 2.5 months. I wonder how much that is if you were to translate that time and effort into hard dollars spent. I also wonder what the return on investment is for that time, effort, and money. If we’re going to start running healthcare like a business, which I argue we should, this seems like a great place to start.

STEP 1: Reduce the amount of time spent on this process by ensuring the data is trustworthy. There are way too many “validation” steps. Most people do not trust the data they’re given, and therefore end up re-validating according to their own unique way of massaging the data.

STEP 2: Integrate data from multiple sources so your Quality Abstractors and Analysts aren’t searching in 10 different places for the information they need. I’m currently helping a client implement interfaces for surgery, general lab, microbiology, blood bank, and pharmacy into their quality reporting system so their analysts can find all the information they need to report infection rates, core measures, and patient safety metrics. In addition, we built a Business Objects universe on top of the quality data store and they can do dynamic reporting in near real time. The amount of time saved is amazing and we have been successful in dramatically shifting the type of work these people are responsible for. The BI Capability Maturity Model below depicts our success helping them move from left to right.
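As a simplified sketch of what that integration buys you, the snippet below joins hypothetical surgery, microbiology, and pharmacy extracts into one frame so an abstractor isn’t searching multiple systems per case. The file names and columns are assumptions, and the toy “positive culture” measure is illustrative, not a real SSI definition:

```python
import pandas as pd

# Hypothetical extracts from the interfaced source systems.
surgeries = pd.read_csv("surgery_cases.csv")    # case_id, patient_id, procedure
micro     = pd.read_csv("microbiology.csv")     # patient_id, organism
pharmacy  = pd.read_csv("pharmacy_orders.csv")  # patient_id, drug

# One integrated frame instead of ten manual lookups per abstracted case.
cases = (surgeries
         .merge(micro, on="patient_id", how="left")
         .merge(pharmacy, on="patient_id", how="left"))

# Toy measure: share of surgical cases with any positive culture on record.
positive_rate = cases.groupby("case_id")["organism"].agg(lambda s: s.notna().any()).mean()
print("Cases with a positive culture: {:.1%}".format(positive_rate))
```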

STEP 3: Empower your analysts. With much more time to actually analyze the information, these people are the best candidates to help find errors in the data, delays in the process, and opportunities for improvement.

STEP 4: Create a mechanism for feedback based on the information you uncover. Both overachievers and underperformers alike need to be recognized for the appropriate reasons. Standardize on the best of what you find, and be sure to localize your intervention where the data is inaccurate or the process breaks down. This will also demonstrate greater transparency on your part.