Are you Paralyzed by a Hoard of Big Data?

Lured by the promise of big data benefits, many organizations are leveraging cheap storage to hoard vast amounts of structured and unstructured data. Without a clear framework for big data governance and use, businesses risk becoming paralyzed under an unorganized jumble of data, much of which has gone stale, past its expiration date. Stale data is toxic to your business – it can lead you to take the wrong action based on information that is no longer relevant.

You know there’s valuable stuff in there, but the thought of wading through all THAT to find it stops you dead in your tracks. There goes your goal of business process improvement, which, according to a recent Informatica survey, most businesses cite as the number-one goal of their big data initiatives.

Just as the individual hoarder often needs a professional organizer to help pare down the hoard and institute acquisition and retention rules that prevent hoard-induced paralysis in the future, organizations should seek outside help when they find themselves unable to turn their data hoard into actionable information.

An effective big data strategy needs to include the following components:

  1. An appropriate toolset for analyzing big data and making it actionable by the right people. Avoid building an ivory tower big data bureaucracy, and remember, insight has to turn into action.
  2. A clear and flexible framework, such as social master data management, for integrating big data with enterprise applications, one that can quickly leverage new sources of information about your customers and your market.
  3. Information lifecycle management rules and practices, so that insights and actions are based on relevant, rather than stale, information (a minimal sketch of such a rule follows this list).
  4. Consideration of how the enterprise application portfolio might need to be refined to maximize the availability and relevance of big data. In today’s world, that will involve grappling with the flow of information between cloud and internally hosted applications as well.
  5. A comprehensive data security framework that defines who is entitled to use, change, and delete the data, along with encryption requirements and any necessary upgrades to network security.
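
To make the lifecycle rules in item 3 concrete, here is a minimal sketch of how stale data might be flagged automatically. The data classes, retention windows, and function names are illustrative assumptions, not a prescription:

```python
from datetime import datetime, timedelta

# Hypothetical retention rules per data class: how long a record is
# considered fresh enough to act on. Classes and windows are invented
# for illustration.
RETENTION_RULES = {
    "customer_contact": timedelta(days=365),
    "web_clickstream": timedelta(days=90),
    "market_pricing": timedelta(days=7),
}

def classify_record(data_class, last_updated, now=None):
    """Return 'active' or 'stale' based on the retention rule for the class."""
    now = now or datetime.utcnow()
    window = RETENTION_RULES.get(data_class)
    if window is None:
        return "unclassified"  # no rule yet: a governance gap to close
    return "active" if now - last_updated <= window else "stale"

# Example: a pricing record from last month should not drive today's decision.
print(classify_record("market_pricing", datetime.utcnow() - timedelta(days=30)))
# -> 'stale'
```

A real information lifecycle policy would also cover archival, legal holds, and deletion, but even a simple freshness check like this keeps expired data from driving decisions.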

Get the picture? Your big data strategy isn’t just a data strategy. It has to be a comprehensive technology-process-people strategy.

All of these elements should, of course, be considered when building your big data business case and estimating return on investment.

Data Darwinism – Evolving your data environment

In my previous posts, I introduced the concept of Data Darwinism, as well as the types of capabilities that allow a company to set itself apart from its competition. Data Darwinism is the practice of using an organization’s data to survive, adapt, compete and innovate in a constantly changing and increasingly competitive business environment. If you take an honest and objective look at how and why you are using data, you might find that you are on the wrong side of the equation. So the question is: how do I move up the food chain?

The goal of evolving your data environment is to move from using your data reactively, just trying to survive, to using it proactively as the foundation for constant innovation that creates competitive advantage.

The plan is simple on the surface, but not always so easy in execution.   It requires an objective assessment of where you are compared to where you need to be, a plan/blueprint/roadmap to get from here to there, and flexible, iterative execution.

Assess

As mentioned before, taking an objective look at where you are compared to where you need to be is the first critical step. This is often an interesting conversation among different parts of the organization that have competing interests and objectives. Many organizations can’t get past this first step. People get caught up in politics and self-interest and lose sight of the goal: moving the organization forward to a position of competitive advantage. Other organizations don’t have the in-house expertise or discipline to conduct the assessment. However, until this step is done, you remain vulnerable to the organizations that have moved past it.

Plan

Great, now you’ve done the assessment; you know what your situation is and what your strengths and weaknesses are. But without a roadmap of how to get to your data utopia, you’re going nowhere. The roadmap is really a blueprint of inter-related capabilities that need to be implemented incrementally over time to constantly move the organization forward. Now, I’ve seen this step end very badly for organizations that make some fundamental mistakes. They try to do too much at once. They make the roadmap too rigid to adapt to changing business needs. They take a form-over-substance approach. Any of these can be fatal to an organization. The key to the roadmap is three-fold:

  • Flexible – This is not a sprint. Evolving your data environment takes time. Your business priorities will change, the external environment in which you operate will change, and so on. The roadmap needs to be flexible enough to adapt to these types of challenges.
  • Prioritized – There will be an impulse to move quickly and do everything at once. That almost never works. It is important to align the roadmap’s priorities with the overall priorities of the organization.
  • Realistic – Just as you had to take an objective, and possibly painful, look at where you were with respect to your data, you have to take a similar look at what can be done given the constraints all organizations face. Funding, people, discipline, and the like all need to be considered when developing the roadmap. In some cases, you might not have the internal skill sets necessary and will have to leverage outside talent. In other cases, you will have to implement new processes, organizational constructs and enabling technologies to enable the move to a new level.

Execute Iteratively

The capabilities you need to implement will build upon each other and it will take time for the organization to adapt to the changes.   Taking an iterative approach that focuses on building capabilities based on the organization’s business priorities will greatly increase your chance of success.  It also gives you a chance to evaluate the capabilities to see if they are working as anticipated and generating the expected returns.   Since you are taking an iterative approach, you have the opportunity to make the necessary changes to continue moving forward.

The path to innovation is not always an easy one.   It requires a solid, yet flexible, plan to get there and persistence to overcome the obstacles that you will encounter.   However, in the end, it’s a journey well worth the effort.

Data Darwinism – Capabilities that provide a competitive advantage

In my previous post, I introduced the concept of Data Darwinism, which states that for a company to become, and remain, ‘king of the jungle,’ it needs the ability to continually innovate. Let’s be clear, though: innovation must be aligned with the strategic goals and objectives of the company. The landscape is littered with examples of innovative ideas that didn’t have a market.

So that begs the question “What are the behaviors and characteristics of companies that are at the top of the food chain?”    The answer to that question can go in many different directions.   With respect to Data Darwinism, the following hierarchy illustrates the categories of capabilities that an organization needs to demonstrate to truly become a dominant force.

Foundational

The impulse will be for an organization to immediately jump to implementing capabilities that it thinks will allow it to sit at the top of the pyramid. And while this is possible to a certain extent, you must put certain foundational capabilities in place to have a sustainable model. Examples of capabilities at this level include data integration, data standardization, data quality, and basic reporting.

Without clean, integrated, accurate data that is aligned with the intended business goals, the ability to implement the more advanced capabilities is severely limited. This does not mean that all foundational capabilities must be implemented before moving on to the next level. Quite the opposite, actually. You must balance the need for the foundational components against the return that the more advanced capabilities will enable.
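
To ground what “foundational” means in practice, here is a minimal sketch of the kind of data quality check this level implies. The validation rules and field names are invented for illustration; real data quality tooling also covers profiling, deduplication, and remediation:

```python
import re

# A few illustrative validation rules for customer records.
RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "postal_code": lambda v: bool(re.fullmatch(r"\d{5}", v or "")),
    "name": lambda v: bool(v and v.strip()),
}

def audit(record: dict) -> list[str]:
    """Return the list of fields in `record` that fail their rule."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

customers = [
    {"name": "Jane Doe", "email": "jane@example.com", "postal_code": "30301"},
    {"name": "", "email": "not-an-email", "postal_code": "303"},
]
for c in customers:
    print(c["name"] or "<missing>", "->", audit(c) or "clean")
```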

Transitional

Transitional capabilities are those that allow an organization to move from siloed, isolated, often duplicative efforts to a more ‘centralized’ platform on which to leverage its data. Capabilities at this level of the hierarchy start to migrate toward an enterprise view of data and include such things as a more complete, integrated data set, increased collaboration, basic analytics and ‘coordinated governance’.

Again, you don’t need to fully instantiate the capabilities at this level before building capabilities at the next level.   It continues to be a balancing act.

Transformational

Transformational capabilities are those that allow a company to start to truly differentiate itself from its competition. They don’t fully deliver the innovative capabilities that set a company head and shoulders above others, but rather set the stage for them. This stage can be challenging for organizations, as it can require a significant change in mind-set compared to the current way the company conducts its operations. Capabilities at this level of the hierarchy include more advanced analytical capabilities (such as true data mining), targeted access to data by users, and ‘managed governance’.

Innovative

Innovative capabilities are those that truly set a company apart from its competitors. They allow for innovative product offerings, unique methods of handling the customer experience and new ways in which to conduct business operations. Amazon is a great example of this: its ability to customize the user experience and offer ‘recommendations’ based on a wealth of buying-trend data has set it apart from most other online retailers. Capabilities at this level of the hierarchy include predictive analytics, enterprise governance and user self-service access to data.
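
For a sense of how buying-trend data turns into ‘recommendations,’ here is a deliberately tiny sketch of a co-purchase signal. The order data and scoring are illustrative assumptions; production recommenders are far more sophisticated:

```python
from collections import Counter
from itertools import combinations

# Illustrative order history: each order is the set of items one customer bought.
orders = [
    {"tent", "sleeping_bag", "lantern"},
    {"tent", "sleeping_bag"},
    {"lantern", "batteries"},
    {"tent", "lantern"},
]

# Count how often each pair of items is bought together.
co_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[(a, b)] += 1

def recommend(item, top_n=2):
    """Items most often co-purchased with `item` (a crude 'bought together' signal)."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [i for i, _ in scores.most_common(top_n)]

print(recommend("tent"))  # e.g. ['lantern', 'sleeping_bag']
```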

The bottom line is that moving up the hierarchy requires vision, discipline and a pragmatic approach.   The journey is not always an easy one, but the rewards more than justify the effort.

Check back for the next installment of this series “Data Darwinism – Evolving Your Data Environment.”

Data Darwinism – Are you on the path to extinction?

Most people are familiar with Darwinism.  We’ve all heard the term survival of the fittest.   There is even a humorous take on the subject with the annual Darwin Awards, given to those individuals who have removed themselves from the gene pool through, shall we say, less than intelligent choices.

Businesses go through ups and downs, transformations, up-sizing/down-sizing, centralization/decentralization, etc. In other words, they are trying to adapt to current and future events in order to grow. Just as in the animal kingdom, some will survive and dominate, and some will not fare as well. In today’s challenging business environment, while many are trying to merely survive, others are prospering, growing and dominating.

So what makes the difference between being the king of the jungle and being prey? The ability to make the right decisions in the face of uncertainty. This is often easier said than done. However, at the core of making the best decisions is making sure you have the right data. That brings us back to the topic at hand: Data Darwinism, which can be defined as:

“The practice of using an organization’s data to survive, adapt, compete and innovate in a constantly changing and increasingly competitive business environment.”

When asked to assess where they are on the Data Darwinism continuum, many companies will say that they are at the top of the food chain, that they are very fast at getting data to make decisions, that they don’t see data as a problem, and so on. However, when pressed to evaluate their situation objectively, they often come up with a very different, and often frightening, picture.

It’s as simple as looking at your behavior when dealing with data:

If you find yourself exhibiting more of the behaviors on the left side of the picture above, you might be a candidate for the next Data Darwin Awards.

Check back for the next installment of this series “Data Darwinism – Capabilities that Provide a Competitive Advantage.”

Clinical Alerts – Why Good Intentions Must Start as Good Ideas

As the heated debate continues about ways to decrease the costs of our healthcare system while simultaneously improving its quality, it is critical to consider the most appropriate place to start – which depends on who you are. Much has been made about the advantages of clinical alerts, especially their use in areas high on the national radar like quality of care, medication use and allergic reactions, and adverse events. Common sense, though, says walk before you run; in this case, it’s crawl before you run.

Clinical alerts are most often electronic messages sent via email, text, page, or even automated voice to notify a clinician or group of clinicians to take a course of action related to their patient care, based on data retrieved in a Clinical Decision Support System (CDSS) designed for optimal outcomes. The rules engine that generates alerts is built specifically for various areas of patient safety and quality, like administering vaccines to children, core measure compliance, and preventing complications like venous thromboembolism (VTE) (also a core measure). The benefits of using clinical alerts in various care settings are obvious if the right people, processes, and systems are in place to consume and manage the alerts appropriately. Numerous studies have highlighted the right and wrong ways of implementing and utilizing alerts. The best criteria I’ve seen consider five major themes when designing alerts: Efficiency, Usefulness, Information Content, User Interface, and Workflow (I’ve personally confirmed each of these in numerous discussions with clinicians, from ED nurses to anesthesiologists in the OR to hospitalists on the floors). And don’t forget one huge piece of the alerting discussion that often gets overlooked… the patient! While some of these may be obvious, all must be considered as the design and implementation phases of the alerts progress.
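
To illustrate what one rule in such an engine might look like, here is a toy sketch using the VTE example. The patient fields, 24-hour threshold, and rule logic are simplified assumptions for illustration, not clinical guidance:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Toy patient record with only the fields this rule needs; a real CDSS
# pulls these from many source systems. Field names are invented.
@dataclass
class Patient:
    admitted_at: datetime
    diagnosis: str
    vte_prophylaxis_ordered: bool

def vte_prophylaxis_alert(p: Patient, now: Optional[datetime] = None) -> Optional[str]:
    """Fire a reminder if a post-surgical patient has no VTE prophylaxis order
    within 24 hours of admission (a simplified stand-in for the core measure)."""
    now = now or datetime.utcnow()
    if p.diagnosis == "post_surgical" and not p.vte_prophylaxis_ordered:
        if now - p.admitted_at > timedelta(hours=24):
            return "ALERT: no VTE prophylaxis ordered within 24h of admission"
    return None  # no alert: rule conditions not met

p = Patient(datetime.utcnow() - timedelta(hours=30), "post_surgical", False)
print(vte_prophylaxis_alert(p))
```

A production rules engine would evaluate hundreds of such rules against live feeds and route each alert to the right clinician via the channels described above.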

OK, Now Back to Reality

A discussion about how clinical alerting can improve the quality of care is limited to the very few provider organizations that already have the infrastructure and resources in place to implement such an initiative. This means that if you are seriously considering such a task, you should already have:

  • an Enterprise Data Strategy and Roadmap that tells you how alerts tie into the broader mission;
  • Data Governance to assign ownership and accountability for the quality of your data and implement standards (especially when it comes to clinical documentation and data entry);
  • standardized process flows that identify points for consistent, discrete data collection;
  • surgeon, physician, anesthesiology, nursing, researcher, and hospitalist champions to gather support from various constituencies and facilitate education and buy-in; and
  • oh yeah, the technology and skilled staff to support a multi-system, highly integrated, complex rules-based environment that will likely change over time and come under increasing scrutiny…

Or a strong relationship with an experienced consulting partner capable of handling all of these requirements and transferring the necessary knowledge along the way.

I must emphasize the second bullet for a moment: data governance is critical to ensure that the quality of the data being collected passes the highest level of scrutiny, from doctors to administrators. This is of the utmost importance because the data forms the basis of the information that decision makers act on. The quickest way to lose momentum and buy-in on any project is to put bad data in front of a group of doctors and clinicians; trust me when I say it is infinitely more difficult to win their trust back once you’ve made that mistake. On the other hand, if they trust the data and understand its value in near real time across their spectrum of care, you quickly turn them into leaders willing to champion your efforts. And then you have a solid foundation for any healthcare analytics program.

If you are like the majority of healthcare organizations in this country, you may have some pieces of this puzzle in various stages of design, development, deployment or implementation. In all likelihood, though, you are at the early stages of the Clinical Alerts Maturity Model and, all things considered, should have alerting functionality in the later years of your strategic roadmap. There are, however, many projects with low costs, fast implementations, quick ROIs, and ample lessons learned to draw on – Computerized Physician Order Entry (CPOE), electronic nursing and physician documentation, a Picture Archiving and Communication System (PACS), a clinical data repository (CDR) – that can use alerting as a prototype or proof of concept to demonstrate the broader value proposition. Clinical alerting, to start, should be incorporated alongside projects that have proven impact across the Clinical Alerts Maturity Model before it is rolled out as a stand-alone initiative.

From Free Text Clinical Documentation to Data-rich Actionable Information

Hey, healthcare providers! Yeah, you, the “little guy,” the rural community hospital; or you, the “average Joe,” the few-hundred-bed hub hospital with outpatient clinics, an ED, and some sub-specialties; or you, the “behemoth,” the one with the health plan, physician group, outpatient, inpatient, and multi-discipline, multi-care-setting institution. Is your EMR really just an electronic filing cabinet? Do nursing and physician notes, standard lab and imaging orders, registration and other critical documents just get scanned into a central system that can’t be referenced later on to meet your analytic needs? Don’t worry, you’re not alone…

Recently, I blogged about some of the advantages of Microsoft’s new Amalga platform; I want to emphasize a capability of Amalga Life Sciences that I hope finds its way into the range of healthcare provider organizations mentioned above, and quickly! That is, the ability to create a standard ontology for displaying and navigating the unstructured information collected by providers across care settings and patient visits (see my response to a comment about Amalga Life Sciences’ use of UMLS for a model of standardized terminology). I don’t have to make this case to the huge group of clinicians already too familiar with this process in hospitals across the country; but the argument (and likely ROI) clearly needs to be articulated for those individuals responsible for transitioning from paper to digital records at the organizations that are dragging their feet (>90%). The question I have for these individuals is: why is this taking so long? Why haven’t you been able to identify the clear-cut benefits of moving from paper-laden manual processes to automated, digital interfaces and streamlined workflows? These folks should ask the corporate executives at hospitals in New Orleans after Hurricane Katrina whether they had hoped to have this debate long before their entire patient population’s medical records drowned; that is just one reason why “all paper” is a strategy of the past.

Let’s take one example most provider organizations can conceptualize: a pneumonia patient’s flow through the Emergency Department. There are numerous points throughout this process that could be considered “data collection points.” Collectively, and over time, these paint a vivid picture of the patient experience, from registration to triage to physical exam and diagnostic testing to possible admission or discharge. With this data you can do things like real- or near-real-time clinical alerting to improve patient outcomes and compliance with regulations like CMS Core Measures; you can identify weak points or bottlenecks in the process and allocate additional resources; you can model best practices identified over time to improve clinical and operational efficiency. With this data written on paper, though (and remember: one piece of paper for registration, a separate piece for the “Core Measure Checklist,” another for the physician exam, another for the lab/X-ray report, etc.), and maybe scanned into a central system, this information tells you very little. You are also then at the mercy of your ability to actually read a physician’s handwriting and to analyze scanned documents, versus delineated data fields that can be trended over time, summarized, visualized, drilled down into, and so on.
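
As a small illustration of the difference, here is what one encounter’s “data collection points” might look like as discrete, timestamped fields. The field names and values are invented for the example:

```python
from datetime import datetime

# Discrete, timestamped data points from the ED flow described above.
# A scanned paper form cannot be queried like this.
encounter = {
    "mrn": "12345",
    "registered_at": datetime(2009, 11, 3, 14, 5),
    "triaged_at": datetime(2009, 11, 3, 14, 20),
    "chest_xray_at": datetime(2009, 11, 3, 15, 10),
    "antibiotic_given_at": datetime(2009, 11, 3, 17, 45),
}

# Door-to-antibiotic time, the kind of interval that core-measure reporting
# and bottleneck analysis need, falls out of simple arithmetic.
door_to_abx = encounter["antibiotic_given_at"] - encounter["registered_at"]
print(door_to_abx)  # 3:40:00 -- trend this across encounters to find bottlenecks
```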

Vulnerabilities and Liabilities from Poor Documentation

Relying on poor documentation – illegible penmanship, incomplete charting, unapproved abbreviations – burdens nurses and creates a huge liability. With all of the requirements and suggestions for the proper way to document, it’s no wonder this area is so prone to errors. There are a variety of consequences to performing patient care based on “best guesses” when reading clinical documentation. Fortunately, improving documentation directly correlates with reduced medical errors. The value proposition for improved data collection and standardized terminology for that data makes sense operationally, financially, and clinically.

So Let’s Get On With It, Shall We?

Advancing clinical care through the use of technology is seemingly one component of the larger healthcare debate in this country centered on “how do we improve the system?” Unfortunately, too many providers want to sprint before they can crawl. Moving off of paper helps you crawl first; it is a valuable, achievable goal for the majority of organizations burdened with manual processes and their costs, and if done properly, the ROI can be realized in a short amount of time with manageable effort. Having said this, the question quickly becomes, “are we prepared to do what it takes to actually make the system improve?” Are you?

Enterprise Information Strategy in 2009

If you’ve been in an IT-related role for more than 10 years, you’ve likely enjoyed the boom and bust the economy has provided. Healthier times enhance business capabilities in the form of multi-million-dollar, cross-organization implementations, while leaner times like these allow only the most critical needs to be fulfilled. Yet while the volatility wreaks havoc on your organization, one IT spend continues to stay strong. Strategy.

Strategy is a sound investment in prosperous times, since the confirmation it provides protects the investment in larger-scale initiatives. For example, a company committed to providing better customer service and marketing additional products into its customer base will undertake a two- to three-year set of tactical CRM initiatives. Success factors include the three usual suspects, ‘People, Process and Technology’, and aligning each with an ideal future-state vision is critical.

A well executed strategy provides an education for stakeholders and builds consensus among individuals who may have never sat around the same conference room table before. It coalesces and prioritizes goals and objectives, drafts a future state architectural blueprint and describes business processes that will endure, and establishes a long term Road Map that orchestrates incremental efforts and illustrates progress.

So if strategy is a safe bet in better times, why invest in one now?  For executives I’ve met with most recently (Q3 and Q4 2008), a popular form of strategy is analogous to grandma’s attic. At some point, it may have occurred to you to look in grandma’s attic for something that may be useful, and if you’re truly fortunate, there may be something extremely valuable you hadn’t counted on. For C-level executives looking for ways to improve their bottom lines, the same treasure hunt exists in the corporate information they already possess.

To understand whether your enterprise information holds hidden treasures, explore these 10 questions with your organization. Answering ‘No’ or ‘Not sure’ to any question that has exceptional relevance within your organization may suggest looking into an Enterprise Information Strategy engagement.

  1. Do visionaries within my company have visibility to key performance indicators that drive revenue or lower costs?
  2. Do I understand who my customers are and which products they own?
  3. Am I able to confidently market additional products into my existing customer base?
  4. Do I possess data today that would provide value to complementary industries through new information-based offerings?
  5. Will my information platforms readily scale and integrate to meet the demands of company growth through acquisition?
  6. Am I leveraging the most cost-effective RDBMS software and warehouse appliance technologies?
  7. Do I understand the systemic data quality issues that exist in the information I disseminate?
  8. Do the organizations I support understand the reporting limitations of my current state architecture?
  9. Is there an architectural blueprint that illustrates an ideal 2 to 3 year business intelligence future state?
  10. Does my company have visibility to a Road Map that timelines planned projects and the incremental delivery of new business insights?

Why creating actionable information from your existing systems is so difficult

With all the easy-to-use business intelligence tools and technology we have today, why is it so difficult to create actionable information from the wealth of data in our organizations?

To answer that, one needs to understand, at a high level, the systems we have built and how they got that way. Your core business systems have evolved over time, budget cycle by budget cycle, with no eye toward the overall enterprise. Systems were built to support core business functions – Payroll/HR, General Ledger, Inventory, etc. They were transactional in nature, designed to meet immediate requirements (e.g., cut payroll checks, track inventory, manage an assembly line) that did not include getting business intelligence out. Over time these systems became islands of data, popularly known as silos.

Add the fact that silos are structured differently and that common data, like product and customer, is typically not standardized, and answering questions across silos becomes difficult and labor-intensive.

As these systems matured, the owners of each silo had departmental business intelligence needs. So, as budget became available, they added a data warehouse or data mart on top of their silo.

The result is larger silos, with a larger sunk investment and still no ability to provide enterprise answers or actionable information. This approach worked for immediate departmental BI needs, but if the business asks a question of data that resides in two or more silos, getting the answer usually involves a significant IT effort. By the time IT responds, the business has moved on to a different question. So the business analyst starts gluing spreadsheets together to provide some insight, kicking off the next activity in the BI food chain: manual analytics.
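
To see why such cross-silo questions are so labor-intensive, consider a toy sketch of two silos that describe the same customers differently. The records and the crude name standardization are purely illustrative:

```python
# Two departmental silos describing the same customers differently;
# records and field names are invented for illustration.
billing = [
    {"cust_id": "C-001", "name": "ACME CORP.", "revenue": 120_000},
    {"cust_id": "C-002", "name": "GLOBEX",     "revenue": 45_000},
]
support = [
    {"customer": "Acme Corp", "open_tickets": 7},
    {"customer": "Globex Corporation", "open_tickets": 2},
]

def normalize(name: str) -> str:
    """Crude name standardization; real master data management goes far beyond this."""
    n = name.lower().rstrip(".")
    for noise in (" corporation", " corp"):
        n = n.replace(noise, "")
    return n.strip()

# Only after standardizing the shared key can a cross-silo question be answered:
# "Which high-revenue customers have the most open support tickets?"
tickets = {normalize(r["customer"]): r["open_tickets"] for r in support}
for r in billing:
    key = normalize(r["name"])
    print(r["name"], r["revenue"], tickets.get(key, "no match"))
```

Until the shared keys are standardized (master data management territory), every cross-silo question requires this kind of one-off reconciliation, which is exactly the work that ends up in glued-together spreadsheets.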