Five keys to thriving during hypergrowth

When your successful strategy pays off and you find your business in a period of hypergrowth, keeping everything moving forward in alignment (instead of spinning out of control) is your biggest challenge. Here are five keys to sustaining your success:

1. Work smarter, not harder – review your business processes and look for ways to eliminate tasks that don’t add significant value, or automate manual handoffs.

2. Great tools are always a good investment – you can’t sustain hypergrowth with yellow pads and Excel spreadsheets. Put more power into the hands of key users, so they don’t have to rely on IT for queries and reports.

3. Keep an eye on profits while focusing on growth – sustain your sales momentum, but eliminate waste and manage your profit margins. Make sure you are getting maximum value out of your marketing efforts, as well as keeping an eye on your cost of goods sold.

4. Bureaucracy strangles growth – your back-office organization should avoid imposing cumbersome processes on the parts of your business that sell, produce and deliver your products and services. Use effective collaboration tools to cut the middlemen out of your business processes.

5. Choose meaningful KPIs – less is more; they aren’t KEY performance indicators if you have a list of 20 KPIs for one area of the business. And remember that hypergrowth KPIs differ from downturn KPIs.

If you are in a rapid growth phase, what are you tracking now? If you are hoping to achieve hypergrowth, what are your KPIs? Leave us a comment.

Product-based Solutions Versus Custom Solutions: Tomb Raider or Genesis?

The Product-based Solution is where most of Corporate America is going for IT today.  The talent required to provide a successful implementation (one you actually renew license maintenance on rather than let quietly die an ignominious death) requires the tenacity, deep specialized product knowledge (read: arcane dark arts), and courage of a cinema Tomb Raider.  The team has to know the target product as well as Indiana Jones knows Egyptology, with equivalent courage, problem-solving skills, and morals (one can’t be squeamish hacking a solution into submission) to achieve a usable solution rather than an embarrassing product snap-in.  In addition to their product skills, the team must be able to quickly navigate the jungle of existing applications with their mysterious artifacts to get the proper integration points and data (Gold! Gold! I say!).

What if the team can’t or won’t navigate your jungle of existing applications, or does not know all of the idiosyncrasies of the product to be installed?  Well, you get an Embarrassing Product Snap-In (Do Not Pass Go, Do Not Collect $200, Do Flush Career).  Every seasoned IT professional has seen one of these puppies: they are the applications you can’t get anyone to use, usually because they do not connect to anything users currently work with, or have real usability issues (Harry Potter vs. MIT interface).  Yes, the product is in.  Yes, it tests to the test plan criteria.  And yes, it looks like post-apocalypse Siberia as far as users are concerned (What if we install CRM and no one comes? Ouch! No renewal for Microsoft/Oracle, bummer).

Custom Solutions are more like Genesis: Let there be Light! (ERP, CRM, Order-Entry, you get the picture).  It is a Greenfield Opportunity!  The team you need is just as talented as a Product-based Solution team, but very different.  They need to be able to create a blueprint of your desires, like a rock-star architect for a signature building.  The team needs to be expert in software engineering and technology best practices.  As well, the team needs to be able to translate your users’ meandering descriptions of what they do (or don’t do) into rational features resembling business process best practice.  That was easy!

In the custom case, the risk is creating Frankenstein’s monster rather than new life (It’s alive! It’s alive!).  Again, every seasoned IT professional has seen one of these embarrassing creations (Master, the peasants/users are at the gate with pitchforks and torches!).  The end result of one of these bad trips (Fear and Loathing in ERP) is the same as, but usually more expensive than, the Product-based alternative.

Debbie Downer, what should I do?  Reality is as simple as it is hard: pick the right solution for the organization, Product-based or Custom.  Then get the right team, Tomb Raider or the Great Architect of Giza.

How does a data-driven healthcare organization work?

As the pressure increases for accountability and transparency in healthcare organizations, the spotlight is squarely on data: how the organization gathers, validates, stores and reports it.  In addition, the increasing level of regulatory reporting is driving home a need for certifying data – applying rigor and measurement to its quality, audit, and lineage.  As a result, a healthcare organization must develop an Enterprise Information Management approach that zeros in on treating data as a strategic asset.  While treating data as an asset would seem obvious given the level of IT systems necessary to run a typical healthcare organization, the explosion of digital data collected, and of types of digital data (e.g., video, digital photos, audio files), has overwhelmed the ability to locate, analyze and organize it.

A typical example of this problem comes when an organization decides to implement Business Intelligence or performance indicators with an electronic dashboard.  There are many challenges in linking data sources to corporate performance measures.  When the same data element exists in multiple places (e.g., patient IDs, encounter events), there must be a decision about the authoritative source or “single version of the truth.” Then there is the infamous data collision problem: Americans move around, and organizations end up with multiple addresses for what appears to be the same person, or worse yet, multiple lists of prescribed medications that don’t match.  Reconciling such discrepancies requires returning to the original source of information – the patient – to bring it current.  Each of us can relate to filling out the form on the clipboard in the doctor’s office multiple times.  Finally, there is the problem of sparseness – we have part of the data for tracking performance but not enough for the calculation.  The list can go on and on, but it boils down to having the right data, at the right time, and using it in the right manner.
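
To make the collision concrete, here is a minimal sketch in Python (with hypothetical field names; real patient matching uses far more sophisticated, often probabilistic, algorithms) that flags records from different systems that appear to describe the same person:

```python
# A naive match on fields that rarely change; purely illustrative.
from collections import defaultdict

def match_key(record):
    """Build a simple match key from last name and date of birth."""
    return (record["last_name"].strip().lower(), record["date_of_birth"])

def find_probable_duplicates(records):
    """Group records that collide on the match key across source systems."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

records = [
    {"source": "billing",    "last_name": "Smith",  "date_of_birth": "1970-01-31",
     "address": "12 Oak St"},
    {"source": "scheduling", "last_name": "SMITH ", "date_of_birth": "1970-01-31",
     "address": "450 Elm Ave"},  # same person, different address on file
]

for key, dupes in find_probable_duplicates(records).items():
    print(key, "->", [(r["source"], r["address"]) for r in dupes])
```

Even this toy example shows why the patient is the final arbiter: the code can tell you that two addresses collide, but only the original source can tell you which one is current.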

Wouldn’t the solution simply be to create an Enterprise Data Warehouse or Operational Data Store that has all of the cleansed, de-duplicated, latest data elements in it?  Certainly!  Big IF coming up: IF your organization has data governance to establish a framework for auditability of data; IF your organization can successfully map source application systems to the target enterprise store; IF your organization can establish master data management for all the key reference tables; IF your organization can agree on standard terminologies; and most importantly, IF you can convince every employee who creates data that quality matters, not just today but always.

One solution is to apply a key idea that made personal computers a success – build an abstraction layer.  The operating system of a personal computer established flexibility by hiding the complexity of different hardware items from the casual user through a hardware abstraction layer that most of us think of as drivers.  A video driver, a CD driver, or a USB driver provides the modularity and flexibility to adapt the usefulness of the PC.  The same principle applies to data-driven healthcare organizations.  Most healthcare applications tout their ability to be the data warehouse solution.  However, the need for the application to improve over time introduces change and version-control issues, and thus instability, into the enterprise data warehouse.  Instead, moving the data into an enterprise data warehouse creates the abstraction layer, and the extract, transform and load (ETL) process acts like the drivers in the PC example.  Then, as the healthcare applications move through time, they do not disrupt the Enterprise Data Warehouse, its related data marts and, most importantly, the performance management systems that run the business.  It is not always necessary to move the data in order to create the abstraction layer, but there are other benefits to that approach, including the retirement of legacy applications.
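
To make the driver analogy concrete, here is a minimal sketch (Python, with hypothetical column names, not any particular vendor’s schema) of a per-source ETL mapping that isolates the warehouse from application change:

```python
# Target (warehouse) columns the business reports against.
WAREHOUSE_COLUMNS = ["patient_id", "encounter_date", "total_charge"]

# A per-source "driver": how to derive each warehouse column from a source row.
# When the source application upgrades and renames a field, only this mapping
# changes; the warehouse schema and everything downstream stay stable.
EMR_V7_DRIVER = {
    "patient_id":     lambda row: row["PAT_ID"],
    "encounter_date": lambda row: row["ENC_DT"],
    "total_charge":   lambda row: float(row["CHG_AMT"]),
}

def extract_transform(row, driver):
    """Apply one source's driver to produce a warehouse-shaped record."""
    return {col: driver[col](row) for col in WAREHOUSE_COLUMNS}

source_row = {"PAT_ID": "MRN-0042", "ENC_DT": "2009-03-14", "CHG_AMT": "125.50"}
print(extract_transform(source_row, EMR_V7_DRIVER))
```

The mapping plays the role of the video or USB driver: the application behind it can change versions without the warehouse, its data marts, or the performance management systems noticing.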

In summary, a strong data-driven healthcare organization has to train staff on, and communicate, the importance of data as a support for performance management, and get buy-in from the moment of data acquisition through the entire lifecycle of each key data element.  The payoffs are big: revenue optimization, risk mitigation and elimination of redundant costs.  When a healthcare organization focuses on treating data as a strategic asset, it changes the outcome for everyone in the organization, and restores trust and reliability for making key decisions.

Data Profiling: The BI Grail

In healthcare analytics, as in analytics for virtually all other businesses, the landscape facing the Operations, Finance, Clinical, and other organizations within the enterprise is almost always populated by a rich variety of systems which are prospective sources for decision support analysis.  I propose that we insert into the discussion some ideas about the inarguable value of, first, data profiling, and second, a proactive data quality effort as part of any such undertaking.

Whether a project is built from the ground up, or the scope of an already successful initial project is set to expand significantly, all data integration/warehousing/business intelligence efforts benefit from the proper application of these disciplines – and from the actions taken based upon their findings – early, often, and as aggressively as possible.

I sometimes like to say that in data-centric applications, the framework and mechanisms which comprise a solution are actually even more abstract in some respects than those of traditional OLTP applications because, up to the point at which a dashboard or report is consumed by a user, the entire application virtually IS the data, sans the bells, whistles, and widgets which are the more “material” aspects of GUI/OLTP development efforts:

  • Data entry applications, forms, websites, etc. generally exist outside the reach of the project being undertaken.
  • Many assertions and assumptions are usually made about the quality of that data.
  • Many, if not most, of those turn out not to be true, or at least not entirely accurate, despite the very earnest efforts of all involved.

What this means in terms of risk to the project cannot be overstated.  Because that risk is largely unknown in most instances, it can be neither qualified nor quantified.  It often turns what seems, on the face of it, to be a relatively simple “build machine X” project – with gear A, chain B, and axle C – into “build machine X” with gear A (missing teeth), chain B (not missing any links, but definitely rusty and needing some polishing), and axle C (which turns out not to even exist, though it is much discussed, maligned, or even praised depending upon who is in the room and how big the company is).

Enter The Grail.  If there is a Grail in data integration and business intelligence, it may well be data profiling and quality management, on its own or as a precursor to true Master Data Management (if that hasn’t already become a forbidden term in your organization due to past failed attempts at it).

Data Profiling gives us a pre-emptive strike against our preconceived notions about the quality and content of our data.  It gives us quantifiable metrics by which to measure and modify our judgment of the task before us, and it frequently sends various business units scrambling to improve upon what they honestly did not realize was so flawed.
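
As an illustration, here is a minimal profiling sketch (Python, with made-up data; commercial profiling tools add pattern analysis, cross-column rules and referential checks) that turns an assumption into a measurement:

```python
# Profile one column: how complete is it, and how consistently is it coded?
from collections import Counter

def profile_column(rows, column):
    """Return simple, quantifiable metrics for one column."""
    values = [row.get(column) for row in rows]
    populated = [v for v in values if v not in (None, "", "N/A")]
    return {
        "rows": len(values),
        "null_or_blank_pct": round(100 * (1 - len(populated) / len(values)), 1),
        "distinct": len(set(populated)),
        "top_values": Counter(populated).most_common(3),
    }

rows = [
    {"gender": "F"}, {"gender": "F"}, {"gender": "Female"},  # inconsistent coding
    {"gender": ""},  {"gender": "M"},
]
print(profile_column(rows, "gender"))
```

The output replaces the assertion “gender is always coded M/F” with a measurement – three distinct codings and 20% blank – numbers a business unit can act on.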

Data Quality efforts, following comprehensive profiling and whatever proactive correction is possible, give a project the chance to fix problems without changing source systems per se.  They act before the business intelligence solution becomes either a burned-out husk on the side of the EPM highway (failed because of poor data) or, at best, a de facto data profiling tool in its own right – coughing out whatever data doesn’t work instead of serving its intended purpose: to deliver key business performance information based upon a solid data foundation in which all have confidence.

The return on investment for such an effort is measurable, sustainable, and so compelling as an argument that no serious BI undertaking, large or small, should go forward without it.  Whether in Healthcare, Financial Services, Manufacturing, or another vertical, its value is, I submit, inarguable.

Doublin’ Down in Hard Times

Hard times are definitely here.  By this time everybody in IT-land has done the obvious: frozen maintenance where possible, put off hardware and software upgrades, outsourced where possible, trimmed heads (contractors, consultants, staff), pushed BI/CPM/EPM analytics projects forward, and tuned up data and web resources.

Now is the time to think outside the bunker!

IT needs to consider what will need to be done to nurture the green shoots poking through the nuclear fallout. All of the talking heads and pundits see them (glowing with radiation or whatever), and the utmost must be done to make sure they survive and grow, or we shall all sink into the abyss!

This is the time to double down in IT (poker speak).  It is not about heavily hyped Cloud Computing or the latest must-have tech gadget, but about something much more mundane and boring: improving the business process.  There, I’ve said it – what could possibly be more boring?  It doesn’t even plug in.  In fact (shudder!), it may be partially manual.

Business process is what gets the job done (and feeds our paychecks!).  Recessions are historically the perfect time to revise and streamline (supercharge ’em!) existing business processes, because doing so allows the company to accelerate ahead of the pack coming out of the recession.  In addition, a recession acts as something of a time-out for everybody (I only got beatings, no time-outs for me), like the yellow flag during a NASCAR race.  When the yellow flag is out, it’s time to hit the pits for gas and tires.  Double down while it is slow so you can go faster when things speed up again – obviously the only thing to do.

How? is usually the question.  The best first step is to have existing business processes documented and reviewed.  Neither the staff driving the process nor the business analysts (internal or consultants) are that busy at the moment, so the economic cost of doubling down will be minimized under the economic yellow flag.  The second step is to look for best practice, then glance outside the box to maximize improvement.  The third step is to look for supporting technology to supercharge the newly streamlined business process (I knew I could get some IT in there to justify my miserable existence!).

Small and medium businesses get the biggest bang for the buck with this strategy (just picture trying to gas up and change the tires on the Exxon Valdez at Daytona).  It allows SMBs to leapfrog the best-practice and technology research the Global 2000 have done and cut to the chase without paying the pioneer’s cost (damn, those arrows in the backside hurt!).  Plus, implementation is cheaper during a recession (I love to be on the buy-side).  The hardware, software, and integration guys have to keep busy, so they cut prices to the bone.

The way forward is clear; IT only needs to lead the way.  Following is kind of boring anyway.

Change: Opportunity or Agony

“Embracing change” is a common mantra. However, experiencing change is a certain reality. With it comes a series of choices for everyone involved. Perhaps the game of Jenga™ demonstrates these choices. As you may know, Jenga consists of wooden blocks shaped like tiny beams. The game starts with the beams stacked tightly, three per layer, alternating layers vertically and horizontally. The object is to manually dislodge any block from the tower and place it on a new layer at the very top, expanding the tower upwards until it topples from lack of support below or is blown over by a strong gust of wind.

Like Jenga, a business also grows using its assets, strengths and opportunities to build customers and market share.

To continue comparing Jenga to running an enterprise, consider two different perspectives. The player of the game is like the executives of the organization, moving structural blocks around to expand the organization. This executive has a 360-degree view of the tower, with the ability to stress-test the blocks before committing them to the move, and can scope the environment for threats to the construction, such as a shaky playing table or strong winds.

The contrasting view is that of the employees impacted by the move within an organization; perhaps visualized as tiny ants clinging to the moved block. These individuals have an intimate knowledge of this specific block. They know each dent, scratch and slight change in color. They know how snugly it fits against the neighboring blocks (the nitty-gritties necessary to accomplish a job) and how it informally interacts with others. But this internal perspective lacks the comprehensive view. From within the safety of the tightly-built fortress, workers may not sense the unstable foundation or feel the gusts.

As a block is selected, those associated with it can be hurled into significant change. One’s first reaction to the vibration may be to grab on as hard as possible to the comfort of the block. Despite the desperation, it takes very little time to see that the forces are overpowering and a significant change is imminent. At this point, there are really two broad choices: resist or cooperate.

The consequences of the first choice, resistance, can lead to demise. To explain this, let’s consider the two forms of resistance – denial and defiance.

Denying the seriousness of the changing forces can severely cripple an industry. Current examples of underestimating the impact of an impending change can be seen in traditional media. After initial reluctance, newspapers, magazines, local broadcast television and radio eventually adopted the Internet, but they applied their respective traditional medium’s paradigm to it, using it merely as a broadcasting and publishing vehicle. Newspapers, for example, started by replicating their publication online and updating the sites daily after street publication. Internet users expecting more immediate news discontinued their subscriptions to the physical newspapers and started viewing news on new Internet news sites that refreshed content frequently.

The other form of resistance, defiance, can alienate peers who tire of negative attitudes. Excessive defiant behavior can lead to dismissal by those who perceive it as obstructive.

In contrast, the option of cooperation can lead to quite different outcomes. If the change comes from competitive or industry pressure, adapting to the change’s new opportunities can put you in the driver’s seat. Those Internet sites that enabled the viewer to customize content offer an example of seizing the opportunity to lead the industry. In Jenga, a beam moved to the top is exposed to uncomfortable drafts, unfamiliar elements and added visibility. The gusts and vulnerability can be threatening, and the fall is farther if you are knocked off. However, the experiences gained are the essence of leadership.

Another recent example is the trend to eliminate travel expense. Geographically dispersed employees, trainers and consultants can overcome this obstacle by mastering the various technologies for being productive remotely. As organizations adopt these methods, the paradigms of phone etiquette, correspondence and meeting presentations will morph into new standards. Those of us who have adapted will benefit professionally.

Other gloomy headlines report that many companies have fallen or, as in Jenga, that their towers have toppled. Those who have fallen into the heap are left with the challenge of adapting to a new reality. After some brushing off, their skills can be applied to a new tower. Existing knowledge and tools will be augmented by wisdom for the next cycle of industrial change.

As professionals, we need to recognize that external forces will cause us to make some hard decisions. To react with leadership, we should seek opportunities in the changes, communicate the realities and urge others to accept them.

Budgeting from the trenches

Have you ever noticed how textbooks understate the budgeting process? They tend to gloss over the topic as four steps:

  1. Determine revenues
  2. Forecast expenses
  3. Adjust
  4. Communicate

Some textbooks suggest that the process has iterations. This general outline of the process rings true, but its oversimplification makes the textbook treatment of budgeting meaningless when it comes time to map a process out. I have found that the budgeting challenge differs between organizations. The process design is perhaps similar to how generals draw up battle plans: the available personnel, supplies and equipment are assessed, and the desired outcome is clear, but the details of the approach depend on the specific terrain and rely on the latest tools and information. For this reason, organizations tend to see their budgeting strategies as unique.

Strategy is a fair term to use in budgeting, as its outcome has a great deal at stake. Every staff member submitting input for calculations or making a request for funds has credibility on the line. Without complete information, the profitability of a product, service, region or division is in jeopardy. And the day-to-day performance of the organization can be besieged by the pressure and time consumed in gathering intelligence from the field.

There is a point where this analogy between a battle plan and a budgeting process falls apart: a battle will end, and budgeting does not. A budget plan will play out over and over. This exposes a point of vulnerability in the budgeting process, since it was designed for a set of conditions that most likely have changed. It may no longer be sufficient to budget annually. Reporting requirements may change. Consolidations in the industry confuse the financial results. Or new competitors, products, clients, regions and staff render the plan obsolete. When there is such a gap between the framework and reality, the budgeting framework cannot be trusted for strategic forecasting.

In the wake of the global financial crisis, as organizations seek to maximize cash reserves, evaluate expenses and eliminate risk, the budget process surfaces as a key strategy. Those giving strategic input and making decisions are under unprecedented pressure to ensure accuracy and agility in cost cutting. Those who need to find opportunities for revenue are at a loss for ways to validate an option’s viability. Without the ability to articulate an opportunity’s profitability, an organization is likely to forgo it rather than risk catastrophe.

Today’s battlefield is dynamic and most participants are deep in the trenches. We know that this gloomy economy will end, and we intend to abandon the trench to take new ground. Our challenge is timing, and selecting the method to move forward. While we are in the trenches, let’s review the budgeting tools and design a system that gives us the agility to adapt to changing markets, locate opportunities and operate effectively.

Cutting costs should not mean cutting revenue.

Image courtesy of BusinessWeek 9/25/08: "AmerisourceBergen's Scrimp-and-Save Dave"

The financial panic of late has focused a lot of attention on cutting costs – from frivolities like pens at customer service counters to headcount – and organizations are slowing spending. Bad times force management to review every expense, and in these times to obsess over them. Financial peace, however, has two sides – expense and revenue.

A side effect of cost cutting can be stunted revenue, over both the short and long terms. It is easier to evaluate costs than to uncover revenue opportunities, such as determining truly profitable offerings and adapting your strategies to maximize sales. Just as difficult to quantify are the true losses in unprofitable transactions, and the competitive strategies that could negatively impact your competition.

The answers to many of these questions can be unearthed from data scattered around an organization, by grokking customers and instantly sharing knowledge between disciplines. For example, by combining (a small sketch follows this list):

  • customer survey data;
  • external observations;
  • clues left on web visits;
  • and other correspondence within the corporation;

…an organization can uncover unmet needs to satisfy before the competition, and at reduced investment cost.
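
As a rough illustration of that combination, here is a minimal sketch (Python, with entirely hypothetical survey and clickstream data): low satisfaction scores joined to what those same customers searched for on the web can point directly at an unmet need.

```python
# Join two of the sources above: survey scores and clues left on web visits.
survey_scores = {  # customer_id -> satisfaction score (1-5), from survey data
    "C001": 2, "C002": 5, "C003": 1,
}
web_searches = [  # search terms captured during web visits
    {"customer_id": "C001", "search": "cancel service"},
    {"customer_id": "C003", "search": "weekend delivery"},
    {"customer_id": "C002", "search": "order status"},
]

# Surface what dissatisfied customers are looking for but apparently not finding.
unmet_needs = [
    (visit["customer_id"], visit["search"])
    for visit in web_searches
    if survey_scores.get(visit["customer_id"], 5) <= 2
]
print(unmet_needs)  # [('C001', 'cancel service'), ('C003', 'weekend delivery')]
```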

When external factors, like a gloomy job outlook, cause customers to change behavior, it is time to use all information at your disposal. Those prospects changing preferences for your offerings can provide golden intelligence about the competition or unmet needs.

Pumping information like this is the heart of business intelligence. Marketing and Sales can uncover the opportunity; however, it is up to the enterprise to determine how to execute a timely offering. Financials, human capital planning, and operations work in concert to develop the strategy, which requires forecasting data, operational statistics and capacity planning data to line up.

A good strategist views all angles, not just reducing cost.

Enterprise Information Strategy in 2009

If you’ve been in an IT-related role for more than 10 years, you’ve likely enjoyed the boom and bust the economy has provided. Healthier times enhance business capabilities in the form of multi-million dollar, cross-organization implementations, while leaner times like these allow only the most critical needs to be fulfilled. So while the volatility wreaks havoc on your organization, one IT spend continues to stay strong. Strategy.

Strategy is a sound investment in prosperous times, since the confirmation it provides protects the investment in larger-scale initiatives. For example, a company committed to providing better customer service and to marketing additional products into its customer base will undertake a two- to three-year set of tactical CRM initiatives. Success factors include the three usual suspects – People, Process and Technology – and aligning each with an ideal future-state vision is critical.

A well-executed strategy provides an education for stakeholders and builds consensus among individuals who may never have sat around the same conference room table before. It coalesces and prioritizes goals and objectives, drafts a future-state architectural blueprint, describes business processes that will endure, and establishes a long-term Road Map that orchestrates incremental efforts and illustrates progress.

So if strategy is a safe bet in better times, why invest in one now?  For executives I’ve met with most recently (Q3 and Q4 2008), a popular form of strategy is analogous to grandma’s attic. At some point, it may have occurred to you to look in grandma’s attic for something that may be useful, and if you’re truly fortunate, there may be something extremely valuable you hadn’t counted on. For C-level executives looking for ways to improve their bottom lines, the same treasure hunt exists in the corporate information they already possess.

To understand whether your enterprise information holds hidden treasures, explore these 10 questions with your organization. Answering ‘No’ or ‘Not sure’ to any question that has exceptional relevance within your organization may suggest looking into an Enterprise Information Strategy engagement.

  1. Do visionaries within my company have visibility to key performance indicators that drive revenue or lower costs?
  2. Do I understand who my customers are and which products they own?
  3. Am I able to confidently market additional products into my existing customer base?
  4. Do I possess data today that would provide value to complementary industries through new information-based offerings?
  5. Will my information platforms readily scale and integrate to meet the demands of company growth through acquisition?
  6. Am I leveraging the most cost effective RDBMS software and warehouse appliance technologies?
  7. Do I understand the systemic data quality issues that exist in the information I disseminate?
  8. Do the organizations I support understand the reporting limitations of my current state architecture?
  9. Is there an architectural blueprint that illustrates an ideal 2 to 3 year business intelligence future state?
  10. Does my company have visibility to a Road Map that lays out a timeline of planned projects and the incremental delivery of new business insights?

If Master Data Management is on your agenda

Start with Data Quality

Many organizations are currently working on Master Data Management (MDM) strategies as a core IT initiative. One of the fastest paths to failure for these large, multiyear initiatives is to ignore the quality of the data. This is a good post on other MDM design pitfalls.

Master Data Management (MDM) is defined as the centralization, or single view, of X (Customer, Product or other reference data) in an enterprise. Wikipedia says: “master data management (MDM) comprises a set of processes and tools that consistently defines and manages non-transactional data entities of an organization (also called reference data).” MDM is typically a large, multiyear initiative with significant investments in tools, plus two to five times that investment in labor or services to enable the integration of subscribing and consuming systems. For many companies, you are talking millions of dollars over the course of the implementation. According to Forrester, cross-enterprise implementations range anywhere from $500K to $2 million on average, and professional services costs are usually two dollars for every dollar of software license costs. When you consider integration of all your systems for bi-directional synchronization of customer or product information, the services investment over time can be up to five times the license cost.

At its simplest level, MDM is like a centralized data pump or the heart of your customer or product data (the most popular implementations). But once you hook this pump up, if you haven’t taken care of the quality of the data first, what have you done? You have just spent millions of dollars in tools and effort to pollute the quality of data across the entire organization.

Unless you profile the systems to be integrated, the quality of the data is impossible to quantify. The analysts who work with the data in a particular system have an idea of which areas are suspect (e.g., “we don’t put much weight in the forecast of X because we know the data is sourced from our legacy distribution system, which has data ‘problems’ or ‘inconsistencies’”). The problem is that the issues are known at the subconscious level but are never quantified, which means a business case to fix them never materializes or gets funding. In many cases, the business is not aware there is a problem until it tries to mine a data source for business intelligence.

According to a study by the Standish Group, 83% of data integration/migration projects fail or overrun substantially due to a lack of understanding of the data and its quality. Anyone ever work on a data integration project or data mart or data warehouse that ran long? I have, and I’m sure most of the people reading this have too.
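
By way of illustration, here is a minimal sketch (Python, with hypothetical keys, not any particular system) of the kind of number profiling produces – for instance, the rate at which a legacy system’s customer keys actually resolve against the would-be master:

```python
# Quantify a suspected integration problem instead of leaving it subconscious.
erp_customers = {"C001", "C002", "C003", "C004"}          # would-be master keys
legacy_order_keys = ["C001", "C001", "C009", "C003", "", "C777"]  # legacy system

total = len(legacy_order_keys)
matched = sum(1 for key in legacy_order_keys if key in erp_customers)

print(f"{matched}/{total} legacy order keys resolve "
      f"({100 * matched / total:.0f}%) against the customer master")
```

A finding like “only 50% of legacy order keys resolve against the customer master” is exactly the quantified evidence that turns a vague ‘data problem’ into a fundable business case.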

The good news is that data profiling and analysis is a small step you can undertake now to prepare and position yourself for the larger MDM effort. With the right tools, you can assess the quality of the data in your most important data sources in as little as three weeks, depending upon the number of tables and attributes. Further, it is an inexpensive way to ensure that you are laying the foundation for your MDM or Business Intelligence initiatives. It is much more expensive to uncover your data quality problems in user acceptance testing – many times, it is fatal.

The success of your MDM initiative depends on the quality of the data. You can profile and quantify your data quality issues now to proactively head off problems down the road, and build a business case for improving your existing data assets (marts, warehouses and transactional systems). The byproduct of this analysis is that you can improve the quality of the business intelligence derived from these systems and help the business make better decisions with accurate information.