Beware What Lies Beyond the Valley of Despair

If you’ve implemented an ERP system in the last few decades, you have surely seen one of the many representations of the traditional ERP change management curve, with copious advice for avoiding, or reducing the depth of the Valley of Despair. The graph is somewhat misleading, in that it typically ends with a plateau or pinnacle of success, implying that your troubles are over as soon as you go live.

If only that were true.

A more comprehensive graph would look like this:

[Figure: the extended change management curve, descending beyond go-live into the Desert of Disillusionment]

Notice the descent into what I will refer to as the Desert of Disillusionment, that awful place where every “productivity improvement” line item in your rosy ROI analysis (the one that you used to justify the project) is revealed as a mirage.

Why does this happen, and does it have to be this way?  More importantly, what are the warning signs and how should you deal with them?  We will deal with specific topics in future posts, but for now, we invite you to take our short survey on diagnosing enterprise system impact on business productivity.


Are you Paralyzed by a Hoard of Big Data?

Lured by the promise of big data benefits, many organizations are leveraging cheap storage to hoard vast amounts of structured and unstructured data. Without a clear framework for big data governance and use, businesses run the risk of becoming paralyzed under an unorganized jumble of data, much of which has become stale and past its expiration date. Stale data is toxic to your business – it could lead you into taking the wrong action based on data that is no longer relevant.

You know there’s valuable stuff in there, but the thought of wading through all THAT to find it stops you dead in your tracks. There goes your goal of business process improvement, which, according to a recent Informatica survey, most businesses cite as their number-one big data initiative goal.

Just as the individual hoarder often requires a professional organizer to help them pare the hoard and institute acquisition and retention rules for preventing hoard-induced paralysis in the future, organizations should seek outside help when they find themselves unable to turn their data hoard into actionable information.

An effective big data strategy needs to include the following components:

  1. An appropriate toolset for analyzing big data and making it actionable by the right people. Avoid building an ivory tower big data bureaucracy, and remember, insight has to turn into action.
  2. A clear and flexible framework, such as social master data management, for integrating big data with enterprise applications, one that can quickly leverage new sources of information about your customers and your market.
  3. Information lifecycle management rules and practices, so that insight and action are based on relevant rather than stale information (a minimal rule sketch follows this list).
  4. Consideration of how the enterprise application portfolio might need to be refined to maximize the availability and relevance of big data. In today’s world, that will involve grappling with the flow of information between cloud and internally hosted applications as well.
  5. A comprehensive data security framework that defines who is entitled to use, change, and delete the data, along with encryption requirements and any required upgrades in network security.
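As a concrete illustration of the lifecycle rules in item 3, here is a minimal Python sketch; the two-year threshold and the timestamp field are illustrative assumptions, not a recommendation:

```python
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=730)  # illustrative two-year retention rule

def is_stale(last_touched: datetime) -> bool:
    """Flag records that should be archived or excluded before acting on them."""
    return datetime.now(timezone.utc) - last_touched > STALE_AFTER

record_ts = datetime(2010, 1, 15, tzinfo=timezone.utc)  # hypothetical record
print(is_stale(record_ts))  # True -> don't base today's decisions on it
```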

Get the picture? Your big data strategy isn’t just a data strategy. It has to be a comprehensive technology-process-people strategy.

All of these elements should, of course, be considered when building your big data business case and estimating return on investment.

10 Actionable Web Metrics You Can Use – Part 2

[Image: show your analytics results with gauges]

In Part 1 of this post, I discussed 5 percentage-based metrics that can provide actionable insight. In Part 2, I will go over 5 index-based metrics that can also provide insight into problems that may need to be addressed in order to maximize the value of your website.

1. Campaign Quality Index (CQI)

This index measures how well targeted your campaigns are at driving qualified traffic to your site. Suppose 40% of your traffic comes from a particular campaign, but that traffic provides only 20% of your overall conversions. The CQI for this campaign would be the percent of conversions from the campaign (20%) divided by the percent of visits from the campaign (40%). A value of 1.0 means that a visitor from this campaign is as likely to convert (purchase, sign up, request information, etc.) as a visitor from any other campaign. A value less than 1.0 means they are less likely to convert, while a value greater than 1.0 means they are more likely to convert. If the value is less than 1.0, you need to look at the reasons. You can break this down to individual search engines, or even keyword groups for each search engine, and to each individual banner campaign or other paid campaign you use, including referral partners. Perhaps the targeting is not sufficiently narrow, or the message is not being carried through the site (high bounce rate). You will want to work with your SEM team and landing page design team to make the needed changes. When you make improvements, you can track their effectiveness by watching the index change. Ideally, your analytics dashboard should be set up so that you can see the changes over time.
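If you pull these percentages out of your analytics tool, the index is one line of arithmetic. Here is a minimal Python sketch using the figures from the example above; the function name is mine:

```python
def ratio_index(pct_numerator: float, pct_denominator: float) -> float:
    """Generic ratio index, e.g. share of conversions divided by share of visits.

    1.0 means the segment performs at the site average; below 1.0 it
    underperforms, above 1.0 it outperforms.
    """
    if pct_denominator == 0:
        raise ValueError("denominator segment has no visits")
    return pct_numerator / pct_denominator

# The example above: 20% of conversions from 40% of visits.
cqi = ratio_index(0.20, 0.40)
print(f"CQI = {cqi:.2f}")  # CQI = 0.50 -> the campaign underperforms
```

The same helper covers the other ratio metrics in this post (NCI, RVI, BSI); only the two percentages you feed it change.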

2. New Customer Index (NCI)

This index is focused on transactions (not revenue) from new customers. It is defined as the percent of transactions from new visitors divided by the site’s percentage of new visitors. For example, if 40% of your transactions are from new visitors, and 60% of your traffic is from new visitors, your New Customer Index is 0.67. A value of 1.0 means that a purchase is equally likely to come from a new or returning customer. A value less than 1.0 (as in this example) means that a new visitor is less likely to become a customer, while a value greater than 1.0 means that a new visitor is more likely to become a customer than a returning visitor. Your goal is to strive for a value of 1.0 or better. If the value is less than 1.0, you will need to look at the factors that contribute to a low value. To do this properly, you would want to create a New Customer Index for each type of campaign you run, and compare that to the index for visitors who come to your site from direct entry. A low-performing index for paid search or banner campaigns can mean that you are not targeting the correct market, or that your search terms are not correlated with those looking to purchase your product or service. If the campaign is a banner campaign, either the message is not on target, or the media partner you are using is not attracting the correct demographic.

3. Return Visitor Index (RVI)

This index is simply defined as the percent of return visitors divided by the percent of new visitors. A value of 1.0 means that your site has an equal distribution of new and return visitors. A value greater than 1.0 means that your site is more likely to attract return visitors, while a value less than 1.0 means your site is more likely to attract new visitors. Depending on your type of site and whether your efforts are focused on attracting new visitors or retaining existing ones, you can see how effective those efforts are and then focus on how to improve this index. If your goal is to encourage repeat visits, you need to be concerned with how fresh and relevant your content is, and how effective any email campaigns are at getting registered visitors to come back to your site. Any anomalies need to be investigated. As an example, I once saw a huge jump in new traffic on a client’s site that, according to the analytics report, was the result of an email campaign. However, the email campaigns went only to registered visitors, so in order to have received the email, a visitor would first have had to visit the site; the email campaign visits should therefore have shown up as return visits. What happened is that the email contained an offer for a free exercise DVD, and the link URL was hijacked and placed on a few deal sites. When visitors clicked on the link, they were attributed to the email campaign, as the link contained the email campaign code! By looking at the RVI, I was able to see that there was an issue that needed to be addressed.

4. Branded Search Index (BSI)

Organic search traffic can consist of generic terms that relate to content on your site, plus searches that include your company name or your brand name. Each can be of interest to your search manager. If more visitors come to your site from generic keywords or terms, it means that your site is well optimized for content. If more of your search visits come from branded terms, it means that more people are finding your site by your brand name than by non-branded terms. You can track this by creating a BSI metric, defined as the percent of visits to your site from branded terms divided by the percent of visits from non-branded terms. A value greater than 1.0 means that you are getting more of your traffic from branded terms, while a value less than 1.0 indicates that generic terms are winning the organic search battle. Depending on your search strategy and goals, you can use this information to help adjust your optimization or brand promotional efforts.

5. Site Search Impact (SSI)

Site search is very important for many types of sites. Visitors who come to your site may use site search to help them quickly find what they are looking for. If they find what they want, they may be more likely to continue on to a goal, such as a purchase or lead submission. If they don’t find what they are looking for, they may simply leave the site. The SSI can tell you the impact your site search has on your revenue. To calculate it, take the per-visit revenue from those who use site search and divide it by the per-visit revenue of those who do not. Per-visit revenue is defined as the total revenue or lead value for the month, divided by the number of visits. If your SSI is greater than 1.0, your site search is making you money compared with visits that don’t use it. If it is less than 1.0, your site search is costing you money: those who use site search are less likely to make a purchase or become a lead. This can be the result of searches not returning the desired results, or of result pages that don’t satisfy your visitors’ needs. To solve this problem, you would then need to dive deeper into your site search report to identify and correct the issues.
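Here is the same calculation as a short Python sketch; the revenue and visit figures are illustrative:

```python
def per_visit_revenue(total_revenue: float, visits: int) -> float:
    """Monthly revenue (or lead value) for a segment, divided by its visits."""
    return total_revenue / visits

# Illustrative monthly figures for searchers vs. non-searchers.
ssi = per_visit_revenue(25_000, 2_000) / per_visit_revenue(80_000, 8_000)
print(f"SSI = {ssi:.2f}")  # 12.50 / 10.00 = 1.25 -> site search adds value
```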

Summary

Hopefully this two-part post on 10 actionable web metrics you can use has given you some insight into how to make your web analytics program more actionable. While some of these metrics are fairly easy to construct, others may require filtering, segmentation, calculated metrics, and integration with offline data. Depending on your analytics tool, you may want to use a presentation package like Xcelsius to create and display your gauges and build a dashboard that can be shared with your site’s key stakeholders.

Data Profiling: The BI Grail

In Healthcare analytics, as in analytics for virtually all other businesses, the landscape facing the Operations, Finance, Clinical, and other organizations within the enterprise is almost always populated by a rich variety of systems that are prospective sources for decision-support analysis. I propose that we insert into the discussion some ideas about the inarguable value of, first, data profiling, and second, a proactive data quality effort as part of any such undertaking.

Whether a project starts from the ground up, or the scope of an already successful initial project is envisioned to expand significantly, all data integration, warehousing, and business intelligence efforts benefit from the proper application of these disciplines, and from acting on their findings early, often, and as aggressively as possible.

I like to say that in data-centric applications, the framework and mechanisms that comprise a solution are actually more abstract in some respects than traditional OLTP applications, because up to the point at which a dashboard or report is consumed by a user, the entire application virtually IS the data, sans the bells, whistles, and widgets that are the more “material” aspects of GUI/OLTP development efforts:

  • Data entry applications, forms, websites, etc. generally all exist outside the reach of the project being undertaken.
  • Many assertions and assumptions are usually made about the quality of that data.
  • Many, if not most, of those turn out not to be true, or at least not entirely accurate, despite the very earnest efforts of all involved.

What this means in terms of risk to the project cannot be overstated. Because that risk is largely unknown in most instances, it obviously can be neither qualified nor quantified. It often turns what seems, on the face of it, to be a relatively simple “build machine X” project, with gear A, chain B, and axle C, into “build machine X” with gear A (missing teeth), chain B (not missing any links, but definitely rusty and needing some polishing), and axle C (which turns out not to exist at all, though it is much discussed, maligned, or even praised, depending upon who is in the room and how big the company is).

Enter the Grail. If there is a Grail in data integration and business intelligence, it may well be data profiling and quality management, on its own or as a precursor to true Master Data Management (if that hasn’t already become a forbidden term in your organization due to past failed attempts at it).

Data profiling gives us a preemptive strike against our preconceived notions about the quality and content of our data. It gives us not only quantifiable metrics by which to measure and modify our judgment of the task before us, but frequently results in various business units immediately scrambling to improve upon what they honestly did not realize was so flawed.
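As an illustration of how little code a first profiling pass takes, here is a sketch using pandas; the extract file, the column set, and the MRN uniqueness check are all illustrative assumptions about a hypothetical source system:

```python
import pandas as pd

# Land one prospective source extract and profile every column.
df = pd.read_csv("source_extract.csv")  # hypothetical extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile)

# Test one common assertion head-on: "every record has a unique MRN."
dupes = df["mrn"].duplicated().sum()
print(f"{dupes} duplicate MRNs")  # a non-zero count is a finding, not a failure
```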

Data quality efforts, following comprehensive profiling and whatever proactive correction is possible, give a project the chance to fix problems without changing source systems per se, and to do so before the business intelligence solution becomes either a burned-out husk on the side of the EPM highway (failed because of poor data) or, at best, a de facto data profiling tool in its own right, coughing out whatever data doesn’t work instead of serving its intended purpose: to deliver key business performance information based upon a solid data foundation in which all have confidence.

The return on investment for such an effort is measurable, sustainable, and so compelling as an argument that no serious BI undertaking, large or small, should go forward without it. Whether in Healthcare, Financial Services, Manufacturing, or another vertical, its value is, I submit, inarguable.

Understanding Multichannel Analytics

While web analytics can give you a pretty accurate picture of how well online buyers respond to online marketing activities, it fails to tell you anything about how your online marketing affects offline purchase behavior, or how offline marketing affects online behavior. If your website has a 3% conversion rate, what about the remaining 97% of your visitors? If you send out 50,000 coupons and get a 2% direct response rate, what about the other 98% who got the coupons? Is there a way to measure what they do? Enter multichannel analytics: a process in which all marketing channels are analyzed to develop a more complete view of visitor behavior.

The Four Marketing / Purchase Quadrants

While there are four quadrants of multichannel analytics, as outlined in the figure below, this post will discuss the two online/offline combinations shown in red. I will briefly explain some of the issues regarding multichannel analytics, describe some methods of tagging offline marketing and offline purchases, and show you some of the benefits.

[Figure: the four marketing/purchase quadrants, with the two online/offline combinations highlighted in red]

The biggest problem with tying in offline efforts or offline conversions is the lack of a common point between the two. You have two different databases, one of online data and one of offline data. Unless you have the equivalent of a primary key, you cannot join the two data sets together. Imagine a customer walking into your store or calling your order line and giving you their unique visitor cookie. That would make it fairly easy to tie their online behavior to their offline purchase. You would be able to track what brought them to your website and what they did before coming to your store. Unfortunately, in the real world we cannot tie these efforts together so neatly, so we need to develop solutions. Solutions for both of the red quadrants will be discussed as they relate to the multichannel analytics integration process, as shown in the following figure:

[Figure: the multichannel analytics integration process]

Tracking Offline Marketing to Online Purchases

There are two solutions to tracking your offline marketing efforts. The first solution is to use vanity URLs in your offline marketing efforts. For example, if you go to DellRadio.com, you will be redirected to a dell.com URL that has some tracking code. In the URL string, you will see a parameter titled “cid”, which is used by SiteCatalyst as a campaign ID. Thus, any purchases from visits to DellRadio.com will be credited to their radio campaign.

You can do the same thing with all of your offline efforts. Put vanity URLs on your newspaper or magazine ads, in your mailers and coupons, on billboards and other forms of display advertisements. Use specific vanity URLs in your radio and TV ads, and simply have your IT department do a “301 redirect” that converts these vanity URLs into coded mainstream URLs that your analytic tool can process.
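Here is a minimal sketch of that redirect layer in Flask; the vanity paths, campaign IDs, and domain are all illustrative:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Map each vanity path to the campaign parameter your analytics tool reads.
VANITY_CAMPAIGNS = {
    "radio": "cid=radio_fall",      # e.g. the DellRadio.com pattern
    "billboard": "cid=bb_hwy101",
}

@app.route("/<vanity>")
def vanity_redirect(vanity: str):
    cid = VANITY_CAMPAIGNS.get(vanity)
    if cid is None:
        return redirect("https://www.example.com/", code=302)
    # 301 to the coded mainstream URL, so the campaign gets credited.
    return redirect(f"https://www.example.com/?{cid}", code=301)
```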

The second solution is to promote tracking codes in your offline media, such as infomercials. Someone watching the infomercial can either call the phone number or order online. If they enter the promo code on the website, you will know that the order was the result of the TV ad. What this will not tell you, however, is the percentage of those who came to the site from the infomercial but did NOT buy. If you simply want to allocate revenue to an offline marketing effort, a promotion code will work well with any offline media that drives traffic to your main URL. Within your analytics package, you would tag the code entry as an event, and then look at the revenue associated with each event (a specific code for each offline activity).

Tracking Online Marketing to Offline Purchases

Now that you have a way to track how your offline efforts work to get visitors to your website, how do you measure what they do when they don’t order online?

Capture Visitor Intent

If your business is both online and retail (physical store), you can measure intent to come to the store by tracking results of your store locator and directions links. By setting these as goals, you can then see what searches were done by visitors who have expressed intent to come to your store. To help capture the buyer while he or she is in the buying mood, some stores like Barnes & Noble offer the ability to enter a zip code to see if a book of interest is available at a local store. If so, the customer can reserve it online and go pick it up right away. If you can offer this type of service, you need to tag this event so it can capture what brought the customer to the website, and be able to tie the physical (offline) purchase to the online marketing that resulted in it.

Generate Campaign-Based Coupons for Offline Purchases

It is also possible to have your website generate a unique coupon ID tied to the particular product that was searched. By creating an ID that represents your marketing segmentation (campaign type, campaign source, media placement, keywords, and so on), you can store this information in both your analytics package and your store database. If you use a campaign translation file for your analytics platform, you will want to include the same campaign ID as a prefix to your coupon. The same coupon concept also applies to service businesses such as insurance, reservations, home and professional services, etc., where you give the prospective customer a coupon ID that they can use to get a discount. If your business takes orders or inquiries over the phone, you could have your site coded to include the coupon code next to the phone number on all pages. By tracking the redemption of these coupons, you can compute a click-to-store conversion rate, and factor in offline revenue attributed to specific online marketing campaigns. This will show a higher ROI and perhaps provide justification for more web-related investment.
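A minimal sketch of such a coupon generator in Python; the campaign ID format and the random suffix length are illustrative choices:

```python
import secrets

def make_coupon_id(campaign_id: str) -> str:
    """Coupon code carrying the campaign ID as its prefix, so the same code
    can be matched in both the analytics platform and the store database."""
    suffix = secrets.token_hex(3).upper()  # six random hex characters
    return f"{campaign_id}-{suffix}"

print(make_coupon_id("PPC_SHOES_FALL"))  # e.g. PPC_SHOES_FALL-9F3A1C
```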

Implement Phone Number-Based Tracking

Unique tracking phone numbers can also be used to measure the impact of your online marketing efforts on offline purchases. A service like Voicestar provides these tools. You can place trackable phone numbers on your site, or use services like “Click to Call” and “Form to Phone.” Their system has an API that lets you get data right out to your analytics tool and dashboard. Tracking phone calls is very important, as it is human nature to still want to talk to someone on the phone before making a purchase decision. When using a phone tracking service, or even if you have a block of your own phone numbers to use, it is important not to embed the phone numbers in static content. The phone numbers need to be served by an algorithm that can associate each number with a particular campaign. To further tie the visitor to the phone number, a cookie should also be set that records the tracking source. Thus, if the visitor leaves the site and comes back at a later time, the initial campaign that brought him or her to the site will still receive credit for the sale.
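A minimal Flask sketch of that number-swapping idea; the campaign codes, phone numbers, and cookie name are illustrative:

```python
from flask import Flask, request, make_response

app = Flask(__name__)

# Campaign code -> dedicated tracking number (illustrative values).
TRACKING_NUMBERS = {
    "ppc_brand": "800-555-0101",
    "email_jan": "800-555-0102",
}
DEFAULT_NUMBER = "800-555-0100"

@app.route("/landing")
def landing():
    # Use the incoming campaign code, or fall back to the cookie so a
    # returning visitor still sees the number tied to the original campaign.
    campaign = request.args.get("cid") or request.cookies.get("track_src", "")
    number = TRACKING_NUMBERS.get(campaign, DEFAULT_NUMBER)
    resp = make_response(f"Call us: {number}")
    if campaign:
        resp.set_cookie("track_src", campaign, max_age=30 * 24 * 3600)
    return resp
```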

The biggest drawback to this type of campaign tracking is that, depending on the level of detail you want for your marketing segmentation, you can end up needing dozens or hundreds of phone numbers, which can become expensive and difficult to manage. Instead, you can create a three- or four-digit “extension” that is tied to a web-related order number; when someone calls, the phone operator asks for the extension. This has no incremental cost to implement.

Another phone tracking service is offered by Mongoose Metrics. Their service integrates with most web analytics tools to create an automated URL postback after each call is made. You can perform the same type of analysis, ecommerce conversion tracking, and segmentation that you would for any other page. You can see instantly how well your online marketing activities are generating offline (phone) revenue.

There are many ways to implement phone-based tracking, and they all require integrating your site code with your analytics platform and your backend system.

Utilize Site Surveys to Understand Buying Behavior

Another way to gauge consumer intent is to use online exit surveys. Companies like iPerceptions, ForeSee, and others can provide surveys that your site visitors can take regarding their online experience. You can ask about the likelihood of their making a purchase offline, and how much their online experience would influence their buying decision. On your online order forms and lead forms, you can also ask, “How did you hear about us?” in the form of a drop-down select or radio buttons, including your offline marketing methods as choices. If the online traffic source is “direct entry,” you can then assign credit for the sale to the way the customer said they heard about your site.

Assign Values to Online Leads

If your business model is to let visitors fill out a form to be contacted by an agent or representative, there are a couple of different ways to tie success (revenue) to a campaign. Some analytic packages let you assign a dollar value to goal conversion pages, such as filling out a request for information form, a pre-application, or other form of customer contact. This dollar value is based on two factors – the average close rate of online leads, and the average dollar value of each deal. For example, if your company closes 15% of all of its leads, and the average deal is worth $500, then the value of each lead is $75 (15% of $500). Thus, your web analytics package can compare that value to the cost associated with generating the lead, and the nature of actions that lead up to it (pages visited, items downloaded, actions taken, and so on). If your analytics tool is set up to give credit to the first campaign touch point (PPC campaign, banner ad, referral site, etc…), you can still assign credit for the lead to the original campaign, even if the visitor does not convert until a later date.
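The arithmetic is simple enough to sanity-check in a few lines, using the figures from the example above:

```python
close_rate = 0.15        # share of online leads that eventually close
avg_deal_value = 500.00  # average revenue per closed deal

lead_value = close_rate * avg_deal_value
print(f"Each lead is worth ${lead_value:.2f}")  # $75.00, as in the example
```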

The drawback with this method is that you are dealing with averages for the value of a lead. With average lead values, you cannot measure whether a particular campaign brings in a higher-value customer than another campaign does. You can, however, get an average picture of how effective your online campaigns are right within your web analytics tool, without having to import any external data. For many organizations, this will provide much more insight than they are already getting about their offline purchases. It does require fine-tuning the value you use as the average lead value, based on your close rates and the average dollar value of a new customer.

Track Campaign IDs with Lead Form Submissions

An alternative to this is to create an offline method of tracking online campaigns when a form is submitted. The campaign code that you use in your web analytics package can be stored in a cookie and submitted as part of your lead form. If all these leads are entered into a database, the campaign code can be entered with them and later receive credit for an eventual sale. The exact dollar value of the deal can then also be assigned to the campaign, just as for an eCommerce site. You would then need to integrate the online and offline data.

Reaping the Benefits of Multichannel Integration

So far, I have touched on some of the ways to “tag” offline marketing activities so they can be read by your web analytics program, and how to tag offline behavior that is due to your online marketing efforts. However, to put it all together requires access to all the data, both online and offline, plus an integration plan that combines strategy, technology, business logic, web analytics data, BI data, implementation, analytics and other disciplines to provide the desired results. One of the benefits of a multichannel analytics integration is that you will be able to obtain actionable insights, such as these (some are industry-specific):

  • Enhanced ROI – Once you are able to assign additional offline revenue to your online marketing efforts and online revenue to your offline marketing efforts, you will see a higher ROI, enabling you to justify additional spending on both your online marketing and other web efforts, such as site testing and optimization.
  • Retail Merchandising Decisions – If your business is retail, your online data can be mined to see what items tend to be purchased together, enabling your retail operation to group these same items together for in-store customers.
  • Upsell Opportunities – If your offline customers tend to respond to particular upsell opportunities when they call in or get called back, you can use this information to target similar online customers or visitors, based on data that can be stored in tracking cookies.
  • Re-marketing Intelligence – If you know what online customers come back to your site to buy later, you can use this knowledge to market similar products or services to your in-house mailing or phone list.
  • Additional Retail Outlets – If you see a significant request for retail outlets in areas that you are not currently serving, you can have the data you need to consider expanding your physical presence.
  • New Promotional Activities – If you know that your online visitors express an interest in finding a store based on looking at particular products that they want right away or that tend to be expensive to ship,  you can create geo-targeted online campaigns that are designed to get more buyers to your store. This can also work well for seasonal or event-driven items (snowstorm, hurricanes, extended deep freeze, etc…), where the need for a product is now, not 7 to 10 days from now. By tracking these click-to-store visitors, you will be able to measure the success of these campaigns.

Hopefully, this post has given you some insight into how multichannel analytics works, some of its challenges, and how it can benefit your organization.

Data Darwinism – Are you on the path to extinction?

Most people are familiar with Darwinism; we’ve all heard the phrase “survival of the fittest.” There is even a humorous take on the subject in the annual Darwin Awards, given to those individuals who have removed themselves from the gene pool through, shall we say, less than intelligent choices.

Businesses go through ups and downs, transformations, up-sizing and down-sizing, centralization and decentralization, and so on. In other words, they are trying to adapt to current and future events in order to grow. Just as in the animal kingdom, some will survive and dominate, and some will not fare as well. In today’s challenging business environment, while many are merely trying to survive, others are prospering, growing, and dominating.

So what makes the difference between being king of the jungle and being prey? The ability to make the right decisions in the face of uncertainty. This is often easier said than done. At the core of making the best decisions, however, is making sure you have the right data. That brings us back to the topic at hand: Data Darwinism, which can be defined as:

“The practice of using an organization’s data to survive, adapt, compete and innovate in a constantly changing and increasingly competitive business environment.”

When asked to assess where they are on the Data Darwinism continuum, many companies will say that they are at the top of the food chain, that they are very fast at getting the data needed to make decisions, that they don’t see data as a problem, and so on. However, when asked to truly and objectively evaluate their situation, they often come up with a very different, and often frightening, picture.

It’s as simple as looking at your behavior when dealing with data:

[Figure: data-related behaviors, with warning signs on the left]

If you find yourself exhibiting more of the behaviors on the left side of the picture above, you might be a candidate for the next Data Darwin Awards.

Check back for the next installment of this series “Data Darwinism – Capabilities that Provide a Competitive Advantage.”

Knowledge-based computing: next generation of business intelligence?

Pablo Picasso once said, “Computers are useless. They only give you answers.” The truth is that computers have to work very hard to provide answers to what appear to be simple questions. While we are buried in terabytes, petabytes, and exabytes of data, answers and information can be very hard to come by, especially information necessary for serious business decisions. Data must be viewed in the context of a subject area to become information, and analytic techniques must be applied to information in order to create knowledge worthy of taking action. The challenge is getting data into context within a subject area and applying the right analytic techniques to get “real” answers.

Enter Wolfram Alpha, an “answer engine.” Once touted as the next generation of search engine, this web application combines free-form natural language input (i.e., simple questions) with dynamically computed results. Behind the scenes, a series of supercomputers provides linguistic analysis (context for both the question and the answer), ten terabytes of curated data that is constantly being updated, dynamic computation using 50,000 types of algorithms and equations, and computed presentation with 5,000+ types of visual and tabular output. Sound impressive? It could easily be a glimpse of the next generation of business intelligence and decision-support systems.

Wolfram Alpha lets you input a query that requires data analysis or computation, and it delivers the results for you. Its “curated” data is specially prepared for computation: data that has been hand-selected by experts working with Wolfram, who take steps to make sure the raw data is tagged semantically and presented unambiguously and precisely enough that it can be used for accurate computation. Alpha demonstrates the real power of metadata (data about data) and the importance of semantic tags for categorizing data into the context necessary for providing knowledge and, thus, answers.

Wolfram Alpha is not a search engine, according to Wolfram Research co-founder Theodore Gray, and it is not a replacement for Google. He says that Alpha is very, very different from a search engine. “Search engines are like reference librarians,” Gray explained. “Reference librarians are good at finding the book you might need, but they’re useless at interpreting the information for you.” Alpha takes reams of raw information and performs computations using those data. It produces pages of new information that have never existed on the Internet. “Search engines can’t find an answer for you that a Web page doesn’t have,” Gray explained.

“It’s been a dream of many people for a long time to have a computer that can answer questions,” said Gray. “A lot of people may think of a search engine as that, but if you think about it, what search engines do is an extremely limited subset of that sort of thing.” Examples of how Alpha can be used today range from solving difficult math equations to doing genetic analysis, examining the historic earnings of public companies, comparing the gross domestic products of different countries, even measuring the caloric content of a meal you plan to make. You can find out what day of the week it was on your birthday, or chart the average temperature in your area going back days, months, or years.

Wolfram Alpha would make the “ultimate” business intelligence application by computing over an enterprise data warehouse, once the data was properly “curated.” The ability to create knowledge from data, and particularly to create actionable answers, is what business executives really expect, not prettier presentations. The only questions left for Alpha are:

  1. Who will curate your data for you?
  2. How quickly can you see Alpha running over your data?

IT Cost Cutting and Revenue Enhancing Projects

In the current economic climate, CIOs and IT managers are constantly pushed to “do more with less.” However, blindly following this mantra can be a recipe for disaster. IT budgets are getting squeezed and there are fewer resources to go around, but literally trying to “do more with less” is the wrong approach. The “do more” approach implies that IT operations were not running efficiently and there was a lot of fat to be trimmed; quite often that is simply not the case. It is not always possible to find a person or a piece of hardware sitting idle that can be cut from the budget without impacting something. In most IT departments, however, there are still many opportunities to save cost. The flaw is in the mantra itself: the right slogan would be something along the lines of “work smarter” or “smart utilization of shrinking resources”; not exactly catchy, but it conveys what is really needed.

When times are tough, IT departments tend to hunker down and act like hibernating bears: they reduce all activity (especially new projects) to a minimum and try to ride out the winter, not recognizing the opportunity that a recession brings. A more productive approach is to rethink your IT strategy: initiate new projects that enhance your competitive advantage, cut those that don’t, and reinvigorate the IT department with better alignment to the business needs and a more efficient cost structure. The economic climate and the renewed focus on cost reduction provide the impetus needed to push through initiatives that couldn’t be done before. Corporate strategy guru Richard Rumelt says,

“There are only two paths to substantially higher performance, one is through continued new inventions and the other requires exploiting changes in your environment.”

Inventing something substantial and new is not always easy or even possible, but as luck would have it, the winds of change are blowing pretty hard these days, both in technology and in the business environment. Cloud computing has emerged as a disruptive technology and is changing the way applications are built and deployed. Virtualization is changing the way IT departments buy hardware and build data centers. There is a renewed focus on enterprise-wide information systems, and the emergence of new software and techniques has made business intelligence affordable and easy to deploy. These are all signs of major changes afoot in the IT industry. On the business side of the equation, the current economic climate is reshaping the landscape, and a new breed of winners and losers is sure to emerge. What is needed is the vision, strategy, and will to capitalize on these opportunities and turn them into competitive advantage. Recently a healthcare client of ours spent roughly $1 million on a BI and data strategy initiative and realized $5 million in savings in the first year due to increased operational efficiency.
 
Broadly speaking, IT initiatives can be evaluated along two dimensions: cost efficiency and competitive advantage. Cost efficiency describes a project’s ability to lower your cost structure and help you run operations more efficiently. Projects along the competitive advantage dimension provide greater insight into your business and market trends and help you gain an edge on the competition. Quite often projects along this dimension rely on an early mover’s advantage, which over time may turn into a “me too” as competitors jump aboard the same bandwagon. The life of such a competitive advantage can be extended by superior execution, but over time it will fade; think of the supply-chain automation that gave Dell its competitive advantage in its early years. Such projects should therefore be approached with a sense of urgency, as each passing day erodes the potential for higher profits. In this framework, each project has a component of each dimension and can be plotted along both to help you prioritize the projects that can turn recession into an opportunity for gaining a competitive edge. Here are six initiatives that can help you break the IT hibernation, lower your cost structure, and gain an edge on the competition:

[Figure 1: Categorization of IT Projects]
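If you want to reproduce a categorization chart like Figure 1 for your own portfolio, a minimal matplotlib sketch follows; the projects and their scores are invented for illustration:

```python
import matplotlib.pyplot as plt

# Hypothetical projects scored 0-10 on each dimension.
projects = {
    "Virtualization": (9, 4),
    "BI / data strategy": (6, 9),
    "Cloud migration": (7, 7),
}

fig, ax = plt.subplots()
for name, (cost_eff, advantage) in projects.items():
    ax.scatter(cost_eff, advantage)
    ax.annotate(name, (cost_eff, advantage), xytext=(5, 5),
                textcoords="offset points")
ax.set_xlabel("Cost efficiency")
ax.set_ylabel("Competitive advantage")
ax.set_title("Categorization of IT projects")
plt.show()
```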

[Figure 2: Key Benefits]

In the current economic climate, no project can go far without an ROI justification, and calculating ROI for an IT project, especially one that does not directly produce revenue, can be notoriously hard. While calculating ROI for these projects is beyond the scope of this article, I hope to return to the issue soon with templates to help you get through the scrutiny of the CFO’s office. For now I will leave you with the thought that ROI can be thought of in terms of three components:

  • A value statement
  • Hard ROI (direct ROI)
  • Soft ROI (indirect ROI)

Each one is progressively harder to calculate and requires an additional level of rigor and detail, but improves the accuracy of the calculation. I hope to discuss this subject in more detail in future blog entries.
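As a preview of that discussion, here is the hard-ROI arithmetic at its simplest, using the healthcare client’s figures from above; the soft-ROI number is an invented placeholder to show how the components combine:

```python
investment   = 1_000_000  # the BI and data strategy initiative
hard_benefit = 5_000_000  # measured first-year operational savings (direct)
soft_benefit =   500_000  # hypothetical value of faster decisions (indirect)

hard_roi    = (hard_benefit - investment) / investment
blended_roi = (hard_benefit + soft_benefit - investment) / investment
print(f"Hard ROI: {hard_roi:.0%}, blended: {blended_roi:.0%}")  # 400% / 450%
```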

Cutting costs should not mean cutting revenue.


Image courtesy of BusinessWeek 9/25/08: "AmerisourceBergen's Scrimp-and-Save Dave"

The recent financial panic has focused a lot of attention on cutting costs; from frivolities like pens at customer service counters to headcount, organizations are slowing spending. Bad times force management to review every expense, and in times like these, to obsess over them. The financial picture, however, has two sides: expense and revenue.

A side effect of cost cutting can be stunted revenue, over both the short and long terms. It is easier to evaluate costs than to uncover revenue opportunities, such as determining which offerings are truly profitable and adapting your strategies to maximize sales. Just as difficult to quantify are the true losses in unprofitable transactions, and the competitive strategies that can negatively impact your competition.

The answers to many of these questions can be unearthed from data scattered around an organization, by grokking customers and instantly sharing knowledge between disciplines. For example, by combining:

  • customer survey data;
  • external observations;
  • clues left on web visits;
  • and other correspondence within the corporation;

…an organization can uncover unmet needs to satisfy before the competition does, and at a reduced investment cost.

When external factors, like a gloomy job outlook, cause customers to change behavior, it is time to use all information at your disposal. Prospects’ changing preferences for your offerings can provide golden intelligence about the competition or about unmet needs.

Pumping information like this is the heart of business intelligence. Marketing and Sales can uncover the opportunity; however, it is up to the enterprise to determine how to execute a timely offering. Finance, human capital planning, and operations work in concert to develop the strategy, which requires forecasting data, operational statistics, and capacity planning data to line up.

A good strategist views all angles, not just cost reduction.

It’s the End of the World As We Know It!

The Holidays are a great time for watching “End of the World” shows on the History Channel. They were a great comfort, actually almost encouraging, because all of the prophecies target 2012. “The Bible Code II,” “The Mayan Prophecies,” and the big 2012 special’s compendium of End of the World scenarios, covering Nostradamus to obscure German prophets, all agree that 2012 is the big one (December 21, to be exact!). What a relief! The rest of the news is trending toward canned goods, shotguns, and gold by the end of the year. We really have almost a whole 48 months before everything goes bang (I wasn’t ready anyway; procrastination rules!).

Unfortunately, we need to do some IT planning and budgeting for the new year, and probably should have some thoughts going out 36 months (after that, see the first paragraph). As I discussed in a prior blog, the reporting, BI/CPM/EPM, and analytics efforts are the strongest priority, followed by rational short-term cost-savings efforts. All organizations must see where they are heading and keep as much water bailed out of the corporate boat as possible. Easy call, job done!

Then again, a horrifying thought occurred to me: what if one of these initiatives should fail? (See my nightmares in prior blog posts on Data and Analytics.) I am not saying I’m the Mad Hatter and the CEO is the Red Queen, but my head is feeling a bit loosely attached at the moment. Management cannot afford a failed project in this environment, and neither can the CIO in any company (remember, CIO = Career Is Over).

The best way to ensure successful project delivery (and guarantee my ringside lawn chair and six-pack at Armageddon in 2012) lies in building on best practice and solid technical architecture. For example, the most effective architecture is to use a layer of indirection between the CPM application (like Planning & Budgeting) and the source data systems (ERP, custom transactional). This layer of indirection would be used for data staging, allowing transfer to and from fixed layouts for simplified initial installation and maintenance. In addition, this staging area would be used for data cleansing and rationalization operations, to prevent polluting CPM cubes with uncontrolled errors and changes. In terms of best practice, libraries and tools should be used in all circumstances to encapsulate knowledge, rather than custom procedures or manual operations. Another best practice is to get procedural control of the Excel and Access jungle of wild and woolly data, which stands ready to crash any implementation and cause failure and embarrassment to the IT staff (and former CIO). When systems fail, it is usually a failure of confidence in the validity or timeliness of the information, whether presented by dashboard or simple report.
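To make the staging idea concrete, here is a minimal sketch using SQLite; the table name, columns, and cleansing rules are illustrative assumptions, not a reference implementation:

```python
import sqlite3

conn = sqlite3.connect("staging.db")

# 1. Land the source extract in a fixed-layout staging table, untouched.
conn.execute("""CREATE TABLE IF NOT EXISTS stg_gl_actuals (
    entity TEXT, account TEXT, period TEXT, amount REAL)""")

# 2. Cleanse and rationalize in the staging area, not in the cube:
#    reject rows that would pollute the CPM application.
clean_rows = conn.execute("""
    SELECT entity, account, period, SUM(amount)
    FROM stg_gl_actuals
    WHERE amount IS NOT NULL
      AND account GLOB '[0-9]*'   -- drop free-typed accounts from Excel land
    GROUP BY entity, account, period
""").fetchall()

# 3. Only the cleansed, aggregated set moves on to the CPM load.
print(f"{len(clean_rows)} rows ready for the Planning & Budgeting load")
```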

CPM, EPM, and analytics systems comprise and convey incredibly refined information, and decisions of significant consequence, to restructure and invest, are being made within organizations based on that information. The information and decisions are only as good as the underlying data going into them. So skimping on proper implementation can put the CIO’s paycheck at serious risk (ouch!).