Happy Birthday Office 365, what’s next?

It sure looks like it’s been around for a lot longer, but Office 365 is officially celebrating its first anniversary this week.

It’s true that some aspects of Microsoft’s earlier cloud efforts have been around for 4-5 years under different names like BPOS, but the new branding and consumer side were introduced last year, and SharePoint Online took a huge step forward. So how is it doing?

Not bad, according to various reports. 3.5 million consumers have signed up, and 15% of Exchange users are in the cloud (a 6% increase over the last year). Microsoft is clearly betting the farm on the cloud, and the recent choice of its cloud chief, Satya Nadella, as the next CEO is a telling sign.

A recent technical summary at ZDNet and a financial analysis at Seeking Alpha both look very favorably on the stability and profitability of this model.

We’ve been using Microsoft Office 365 email for a number of years and SharePoint for the last few months, and our experience has been very positive. Our customers have been reporting similar satisfaction levels with the reliability and performance. The main advantages we see are:

  • Reduced IT costs: No need to allocate servers or VMs. No need for redundancy and backups. No need for regular installation of patches and updates and all the testing involved.
  • Faster provisioning: We invested in putting provisioning processes in place that dramatically reduced the timeframe for creating new sites and cut administrative effort.
  • Mobile and iPad access through Office Web Apps.
  • Social: The new newsfeed, Yammer integration, and Communities bring enhanced collaboration and social interaction out of the box.

Looking ahead, there are definitely some concerns and wish-list items I’d like to see Microsoft address for Office 365 and SharePoint Online:

  • Stronger security and privacy commitments. Not that the NSA would have a problem getting to most information anyway, but knowing that all corporate secrets are essentially available to it upon request is disquieting. Multinationals may not be willing, or legally able, to make the jump and trust Microsoft with their data. This may be the biggest obstacle to mass adoption among larger companies; small to mid-size companies may be less concerned.
  • More control. From an IT point of view this is scary. An in-house server you can test, tweak, add memory to, reboot when needed, and install third-party add-ons on. You know, control. Giving away the ability to jump in and intervene is hard. Even when Microsoft delivers reliability and reasonable performance, our natural impulse is to try to make it better: tweak, optimize. There is not much you can do here. I hope Microsoft expands the controls given to customers; it would give a lot of distrustful IT folks a level of comfort they don’t have today.
  • Support for Web Content Management. If we are giving up a local SharePoint environment, why force users to keep one if they want to take full advantage of SharePoint as a content management tool for a public website?
  • Native migration tools. Not that I have anything against our partners the migration-tool makers, but releasing a platform with no out-of-the-box method of upgrading to it was very odd, and the fact that no support has been offered since is disappointing. It leaves the natural audience of small to mid-size businesses with an additional expense just to migrate.
  • A cleaner social toolset. As I wrote earlier in the year, the Yammer acquisition created some confusion among users. The promised SSO is still outstanding, and the small incremental steps, like the one released this week, are a little confusing.

Making sense of SharePoint’s Workflow and BPM capabilities

Workflow and BPM often get lumped together, but it is important to understand the difference between them if you are to pick the right tool for your enterprise. While it is generally agreed that workflow is for modeling simple sequential processes and BPM solutions are more capable of handling complex tasks, the distinction between the two needs to be further sharpened. According to David McCoy of Gartner, BPM can be defined as “… a structured approach employing methods, policies, metrics, management practices and software tools to manage and continuously optimize an organization’s activities and processes.” Workflow, on the other hand, is concerned with tasks and application-specific sequencing of activities through a series of predefined steps, involving a small group of people and/or closely related applications. The distinction between the two is far from crisp, and in fact it can be argued that both are part of the same continuum. However, there is a distinct difference in focus and complexity between the two. Here is a chart that attempts to further define the two based on capabilities and task suitability.

According to a recent Forrester survey, Microsoft and SharePoint came in #1 among IT decision makers for use as a BPM platform, followed by Oracle, SAP, IBM, and a host of other BPM-centric companies. The Forrester report further notes that despite Microsoft’s best efforts to position SharePoint not as a BPM solution but as a collaborative workflow solution, the message does not seem to be coming across clearly. This confusion seems to thrive due to a lack of clear, well-defined goals for business process automation and a limited understanding of the capabilities of SharePoint versus BPM suites (BPMS). The Forrester report outlines that SharePoint’s features for supporting true BPM are limited. Most of SharePoint’s capabilities in this arena are founded on Windows Workflow Foundation (WF). While a custom solution can be developed on the SharePoint and WF APIs to support BPM-like capabilities, such an endeavor is bound to be expensive and brittle. SharePoint shines best when out-of-the-box capabilities are leveraged to the maximum and customizations are managed carefully. SharePoint’s workflow, document management, and collaboration features can be used to develop robust workflow applications that simplify and automate document- and form-centric business processes. SharePoint can also serve as a hub for cross-department and cross-application integration, but only at the user interface level. SharePoint does not pretend to act as middleware or an enterprise service bus (ESB) and therefore does not provide standards-based application integration features; these are tasks best left to dedicated integration platforms or BPM solutions.

The limitations of SharePoint’s built-in workflow and the underlying Windows Workflow Foundation surface quickly when tested against the complexities of true enterprise business process automation. SharePoint’s workflow processes are constrained by site collection boundaries, so any workflow that needs to span organizational boundaries, and as a result site collections, becomes difficult to manage and brittle. For example, if a budget approval process needs to pass through the finance department, the corporate office, and local approvers, and any of these structures use their own site collections, the workflow will require custom coding or manual workarounds. This constraint limits SharePoint workflow to the department or local application level. WF processes are also limited to either the sequence or the state machine pattern. There is no support for a user who makes a mistake and needs to go back to the previous step of a workflow, and multi-level approvals are not supported when a document needs to be routed back to one of the earlier approvers rather than the author. SharePoint workflows are executable programs and therefore cannot easily adapt at runtime (after instantiation) to changes in the rules that may result from changes in the business process environment (e.g., regulation changes or corporate policy changes).
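To make the two supported patterns concrete, here is a minimal sketch, in plain Python rather than WF, of a sequential workflow versus a state machine workflow. The class names, states, and transitions are hypothetical, chosen to echo the budget approval example above.

```python
# Illustrative sketch only (plain Python, not SharePoint or WF code):
# the two workflow patterns Windows Workflow Foundation supports,
# modeled minimally to show why "going back a step" is awkward.

class SequentialWorkflow:
    """Steps always advance in a fixed order; once a step has
    completed there is no built-in way to return to it."""
    def __init__(self, steps):
        self.steps = list(steps)
        self.position = 0

    def advance(self):
        step = self.steps[self.position]
        self.position += 1
        return step


class StateMachineWorkflow:
    """Transitions are event-driven, so a rejection event can route
    an item back to an earlier state, which a pure sequence cannot do."""
    def __init__(self, transitions, start):
        self.transitions = transitions  # {(state, event): next_state}
        self.state = start

    def fire(self, event):
        self.state = self.transitions[(self.state, event)]
        return self.state


# A hypothetical two-level budget approval modeled as a state machine;
# note the reject transitions that move the item backwards.
approval = StateMachineWorkflow({
    ("Draft", "Submit"): "ManagerReview",
    ("ManagerReview", "Approve"): "FinanceReview",
    ("ManagerReview", "Reject"): "Draft",
    ("FinanceReview", "Approve"): "Published",
    ("FinanceReview", "Reject"): "ManagerReview",  # back to an earlier approver
}, start="Draft")
```

Note that even in the state machine style, every "route back" path has to be designed in up front; nothing can be rewired after instantiation, which is exactly the runtime rigidity described above.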

While SharePoint is not ideal for complex business process automation, it can certainly be used to get started. If all your organization needs is automation of simple, commonly used business tasks (approvals, document management, simple HR applications, financial approvals, etc.) that do not require tight integration with other data systems and do not require complex exception processing, modeling, optimization, or monitoring, then it is a good candidate for SharePoint workflow. However, if your organization is truly looking into business process automation and business process improvement (BPI), then there are many third-party solutions (AgilePoint, Global360, K2, Nintex, etc.) that can be layered on top of SharePoint to create a more robust solution. The advantage of a layered solution is that third-party vendors are able to leverage Microsoft’s significant investment in SharePoint’s ease of use, collaboration, and user interface integration while adding core BPM functionality. Such solutions are also typically less expensive and deploy more quickly than a traditional full-blown BPM solution (depending on the situation).

There are two basic flavors of layered BPM solutions (products that leverage SharePoint’s platform and interface for most interactions). The first relies on the underlying WF as its workflow engine. Using WF as the base, these products have built capabilities that are more advanced than SharePoint’s out-of-the-box offering, and they maintain a light footprint by leveraging the SharePoint and WF infrastructure. However, they naturally suffer from some of the same shortcomings as WF. The second group relies on proprietary workflow engines that are not built on top of WF. Such solutions typically have larger footprints, since they create their own parallel infrastructure for workflow processing and data storage. Their independent foundation lets them provide capabilities that are not limited by WF, but typically at the cost of additional infrastructure complexity. There is a place for either kind of solution, and picking the right tool (SharePoint workflow vs. a SharePoint-layered BPM vs. a dedicated BPM suite) is a vital cog in any business process automation or improvement endeavor.

However, the story does not end with picking the right tool; in fact, it is just getting started. Edgewater recently conducted a case study on the effectiveness of such efforts and found a significant disconnect between popular BPM messaging and the companies deploying the technology. While ROI is considered the holy grail of most IT projects, respondents noted that “ROI was not the most important factor …”; other areas, such as customer satisfaction, were more important. The survey also found that while BPM tools are more than capable of modeling complex processes, organizations implementing BPM preferred to “start with well-defined process that involved fewer people to get a quick win and buy-in first”. Perhaps the most important finding was that the success or failure of an implementation depends on “solid understanding of the business AND the necessary technical skills to implement BPM; just one won’t work.” Business process improvement (BPI) needs to be a continuous learning and optimizing cycle. Picking the right tool is only half the battle; having a clear vision of goals and objectives, and of how BPM may or may not help achieve them, is just as essential.


SharePoint 2010 Migration: Options & Planning

Many organizations that are running SharePoint 2003/2007 or another CMS are either actively considering or in the midst of upgrading to SharePoint 2010. In this blog we will look at what is involved in upgrading to SharePoint 2010, the various options available for the upgrade, and the initial planning that needs to precede the migration.

There are two basic methods, provided by Microsoft, of upgrading/migrating from an older version of SharePoint to SharePoint 2010: the in-place upgrade and the database attach upgrade. In addition, there are numerous third-party tools that can help you migrate content and upgrade to SharePoint 2010, not only from an older version of SharePoint but also from other CMSs. Each method has its own set of benefits depending on the objectives of the migration and the specifics of the environment. When selecting a migration path, some of the aspects you may need to consider include:

  • Ability to take the production system offline during the migration
  • Amount of change involved in content and its organization during migration
  • Number of customizations (web parts, themes, meta-data, workflows, etc.)
  • Amount of content being migrated
  • Need to upgrade hardware
  • Need to preserve server farm settings

It is much easier to migrate a clean and lean environment than one that is full of obsolete content, unused features, and broken customizations. Start by cleaning up your existing sites and checking for orphaned sites, lists, web parts, etc. Remove any content that is no longer in use, remove unused features, and ensure the features you do use are present and working. Once your existing SharePoint site is in tip-top shape, you are ready to plan your migration steps.

Before you put your migration/upgrade in motion, you need to understand which migration aspects you can compromise on and which hard constraints you have. For example:

  • Can you afford to put your environment in read-only mode for the duration of the upgrade?
  • Does the amount of content you have make it prohibitive to copy it over the network?
  • Do you have a lot of customization that you have to deal with?
  • Are you planning to reorganize or selectively migrate your content?

The answers to these kinds of questions will direct your choice of migration tools. Here is a checklist that will help you get organized.


Customizations can have a big impact on how quickly and smoothly your migration goes, so it is important to identify and account for as many of them as possible. The PreUpgradeCheck operation can help, but here is a list to help you identify and uncover customizations that can add complexity to your migration efforts.
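As an illustration, the built-in tooling for the pre-upgrade check and for the database attach path looks roughly like this; the URLs and database name below are placeholders, and your farm layout will differ.

```shell
# On the SharePoint 2007 farm (SP2 or later): run the built-in
# pre-upgrade checker, which reports orphaned objects, missing
# server-side customizations, and other upgrade blockers.
stsadm -o preupgradecheck

# For a database attach upgrade: make the old content read-only
# before copying its content database.
stsadm -o setsitelock -url http://oldfarm/sites/intranet -lock readonly

# On the SharePoint 2010 farm: attach the copied content database,
# which upgrades its contents on attach.
stsadm -o addcontentdb -url http://newfarm -databasename WSS_Content_Intranet
```

On SharePoint 2010 itself, the PowerShell cmdlet Mount-SPContentDatabase is the preferred equivalent of addcontentdb, though the stsadm form still works.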

The Microsoft Clinical Information Feedback Loop

Microsoft recently purchased Rosetta Biosoftware from Merck & Co. for its Amalga Life Sciences platform. With this move, Microsoft is starting to differentiate itself from its competition by offering its integrated information solutions, which include HealthVault, Amalga UIS, and Amalga Life Sciences, to both providers and producers. In its crosshairs are the huge budgets available from pharma for drug R&D and clinical trial infrastructure. Microsoft is poised to attract a whole new audience of customers, from pharma to integrated health systems that have their own research entities. If done correctly, Microsoft’s new strategy could become a model for improving the efficiency of clinical research by drastically reducing the most costly resource needed for clinical trials: time.

The current Amalga UIS is fundamentally what I like to call a PDA (no not Public Display of Affection, rather a Patient Data Aggregator). There are three core components that include:

  1. Data Aggregation and Distribution Engine (DADE) – sits on top of healthcare provider sources and listens for HL7 messages; then puts them through transformation and parsing scripts in preparation to be stored in Amalga and sends them to a data store;
  2. Data Store – receives the messages from DADE; a core storage engine, essentially a database with a set of tables specific to segments within the HL7 messages; and
  3. Front End – a web-based presentation layer that was originally designed for patient level data viewing and has plug in capability to provide more appropriate tools for analysis.
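To make the DADE idea concrete, here is a small, hypothetical sketch (plain Python, not Amalga code) of splitting an HL7 v2 message into segments and fields, the kind of parsing such a listener performs before handing data to the store. The sample message and field positions are illustrative, and segment separators are simplified to newlines.

```python
# Hypothetical sketch of HL7 v2 parsing, the kind of transformation a
# DADE-style listener performs before the data store. Illustrative only;
# real HL7 uses carriage-return segment separators and richer encoding.

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into segments keyed by segment ID,
    each segment being a list of pipe-delimited fields."""
    segments = {}
    for line in message.strip().split("\n"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

sample = (
    "MSH|^~\\&|LAB|HOSP|EMR|HOSP|202401011200||ORU^R01|123|P|2.3\n"
    "PID|1||555-44-3333||DOE^JOHN\n"
    "OBX|1|NM|WT^Weight||82|kg\n"
)

parsed = parse_hl7(sample)
# The patient name sits in PID field 5 and the observation value in
# OBX field 5; a store can now persist these as typed table rows.
```

The point of the sketch is only the shape of the work: listen, split into segments, map fields to tables.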

The current needs of data integration seem to be met by this solution, and the high degree of customization an implementation can accommodate makes it even more attractive. Microsoft’s footprint in healthcare is getting bigger; they must understand, though, that this space has many stakeholders. While addressing all their needs is nearly impossible (just ask our hard-working politicians trying to pass healthcare reform legislation), the last people they want to alienate are those they’ve already convinced that Amalga is the healthcare platform of the future, most notably some high-profile integrated health systems across the country.

Integrated health systems (IHS) often provide a combination of services including care delivery, research, education, and even their own health plan (think KP, Johns Hopkins, Geisinger, and Sentara). These entities have a unique opportunity to leverage the Microsoft offerings by creating a continuous feedback loop of information from patient to provider to researcher that improves the quality and accuracy of the data throughout the process. Let’s start with the patient:

  • Patient information in HealthVault – As patients progress from baby boomers (less tech-savvy) to Generation X and Yers (tech-hungry), clinical information will no longer be in the sole possession of doctors. Rather, the demand will be for online, mobile, 24×7 access that is shared and can be updated in real time as health data is gathered by both patients and their doctors. Patients thus become a stand-alone data quality tool as they grow more comfortable verifying, updating, and changing the information in their medical records.
  • Research information in Amalga Life Sciences – Researchers are all too familiar with the tedious, error-prone process of identifying patients with the correct diagnosis and conditions as candidates for clinical trials. As patients become more empowered with their medical records, they make the segmentation of populations a much simpler process.
  • Clinical information in Amalga UIS – Amalga UIS is a mechanism for driving continuous improvement in clinical care by integrating data across the enterprise. One way to improve care is by incorporating best practices identified through clinical research. The information learned from improved research methods are then implemented directly into the standard delivery of patient care offered by provider institutions.

[Figure: Amalga feedback loop]

The Amalga UIS is currently operational in 12 domestic organizations. Because most of these clients are IHSs with research entities, they are in the best position to capitalize on the Amalga Life Sciences offering. These will also be the sites where the ROI Microsoft is hoping for will be demonstrated, for less prestigious organizations to eventually imitate. It raises a question, though, that some current customers will ask: “How can the existing components of the Amalga Unified Intelligence System (UIS) be leveraged in this new offering to make it attractive to the widest audience possible and, more importantly, affordable?” Well, if you can articulate the argument above and identify the huge benefits that can come from the Microsoft feedback loop, your case might be easier to make than you think. And remember, this feedback mechanism is built on the fundamental principle that all stakeholders must have the collective group’s best interest in mind; so don’t forget to share what you find with your neighbor.

Fresh! Content is King

Up until a few years ago, most companies were satisfied with creating websites that were largely static. A website designer would organize largely pre-existing content into a collection of content buckets, slick graphics, and Flash presentations, and a website developer would bring the website into existence. New content would be added only when the old content became obsolete or new products or services were created. This model is essentially one step above the electronic-brochure websites of yesteryear, when companies simply copied their existing paper brochures to the web and called it a website.

In today’s environment of social networking, blogs, and collaboration, static content is not only passé, it prevents companies from deriving advantage from their internal and external user bases and communities of experts. Fresh, timely content drives new traffic to the website and is an effective marketing tool. Unfortunately, most companies do not realize the need for fresh, rapidly evolving content on their website or the role it can play in engaging their customers and prospects. Even companies whose products and services remain largely stable over time need to think about their websites differently. A website is not just a one-way medium for pushing static content outward; it is in fact one of the most cost-effective mechanisms to engage customers and prospects and turn them into a long-term asset. If you believe the nature of your business is such that you don’t need to use your website to engage your customers and prospects, chances are you haven’t fully explored the possibilities. It may take some effort to figure out creative and effective ways to benefit from fresh, meaningful content and interactions with your customers and prospects, but the rewards are well worth it. From local doctors’ offices to insurance companies to Fortune 500 companies, all can benefit from large, loyal, and engaged communities of customers and prospects.

However, your existing static website most likely can’t support the type of content and interactions we just discussed. If your website infrastructure still relies on IT staff to update the content, chances are you won’t be able to morph your website into a hub of fresh, dynamic content that attracts new and repeat visits. The business users and content creators must be able to update the content easily and as frequently as needed.

Of course, you will want some sort of approval workflow and a content publishing process to manage rapidly changing content. Fortunately, there is a category of software designed to do just that. Web content management systems (WCMs) such as Drupal, Joomla, Microsoft SharePoint, and DotNetNuke are designed to give business users and content creators the ability to update content easily and frequently. In most cases, users manipulate content by logging into the administrative version of the website and updating it in a WYSIWYG environment. Content creation and updates can be brought under customized workflows and approval chains, which is quite important in a fast-moving environment. WCM systems also boast many other capabilities, such as:

  • Content Categorization
  • Document Management
  • Delegation
  • Audit Trails
  • Content Creator Grouping
  • Content Templates
  • Discussion Forums
  • Blogs
  • Reviews and Ratings
  • Etc.

Discussion forums and blogs can be used to create vibrant user and expert communities that revolve around your products and services and continuously create new content that keeps customers and prospects coming back to your site. These tools not only provide a mechanism for external parties to contribute new content but also provide a mechanism for them to communicate directly with you about what is important to them. Insights gleaned from such content can be quite valuable in creating new products and services or improving the existing ones.

Now that we’ve talked about the virtues of fresh content and using your website as a two-way medium, you are probably wondering whether you can afford it. A little-known secret about good WCMs is how cost-effective they can be. Creating a custom website from scratch can be a very onerous and expensive proposition. However, most well-respected WCMs offer out-of-the-box templates and web components that make it much faster and cheaper to build a website if you take advantage of their off-the-shelf goodies. If you are considering investing in an upgrade of your website, or even if you are not (consider the cost of the lost opportunity), it would behoove you to look at the benefits of upgrading your website using a WCM system.

Physicians Insist, Leave No Data Behind

“I want it all.” This sentiment is shared by nearly all of the clinicians we’ve met with, from the largest integrated health systems (IHS) to the smallest physician practices, in reference to the data they want access to once an aggregation solution like a data warehouse is implemented. From discussions with organizations throughout the country and across care settings, we understand a problem that plagues many of these solutions: the disparity between what clinical users would like and what technical support staff can provide.

For instance, when building a surgical data mart, an IHS can collect standard patient demographics from a number of its transactional systems. Ask the doctors, “Which ‘patient weight’ would you like to keep: the one from your OR system (Picis), your registration system (HBOC), or your EMR (Epic)?” and sure enough, they will respond, “all 3.” Unfortunately, the doctors often do not consider the cost and effort of providing three versions of the same data element to end consumers before answering, “I want it all.” And therein lies our theory for accommodating this request: Leave No Data Behind. In support of this principle, we are not alone.
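One way to implement “Leave No Data Behind” is to persist every source system’s version of a data element together with its provenance, and let consumers derive a single value on demand instead of discarding two of the three during ETL. A minimal sketch, with hypothetical values and timestamps for the three systems named above:

```python
# Sketch: keep all three source-system values of "patient weight"
# with provenance, rather than picking a single "winner" in ETL.
# Values, timestamps, and field names here are hypothetical.

from dataclasses import dataclass

@dataclass
class SourcedValue:
    element: str      # data element name
    value: float
    source: str       # originating system
    recorded_at: str  # ISO timestamp from the source record

weights = [
    SourcedValue("patient_weight_kg", 82.0, "Picis (OR)", "2009-06-01T07:45"),
    SourcedValue("patient_weight_kg", 83.5, "HBOC (registration)", "2009-05-28T10:10"),
    SourcedValue("patient_weight_kg", 82.6, "Epic (EMR)", "2009-05-30T14:02"),
]

def latest(values):
    """A consumer can still derive a single 'best' value on demand,
    e.g. the most recently recorded one, without losing the others."""
    return max(values, key=lambda v: v.recorded_at)
```

The raw rows stay queryable for whatever question arrives in three to five years, while day-to-day reports can use a derivation rule like `latest`.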

By now you’ve all heard that Microsoft is making a play in healthcare with its Amalga platform. Microsoft will continue its strategy of integrating expertise through acquisition, and so far it seems to be working. Microsoft claims one advantage of Amalga is its ability to store and manage an infinite amount of data associated with a patient encounter, across care settings and over time, for a truly horizontal and vertical view of the patient experience. Simply put: No Data Left Behind. The other major players (GE, Siemens, Google) are shoring up their offerings through partnerships that highlight the importance of access to and management of huge volumes of clinical and patient data.

Why is the concept of No Data Left Behind important? Clinicians have stated emphatically, “We do not know what questions we’ll be expected to answer in 3-5 years, whether based on new quality initiatives or regulatory compliance, and therefore we’d like all the raw and unfiltered data we can get.” Additionally, the recent popularity of clinical dashboards and alerts (“interventional informatics”) in clinical settings further supports this claim. While alerts can help prevent errors, decrease cost, and improve quality, studies suggest that the accuracy of alerts is critical for clinician acceptance; the type of alert and its placement and integration in the clinical workflow are also very important in determining its usefulness. As mentioned above, many organizations understand the need to accommodate the “I want it all” demand, but few combine this with expertise in the aggregation, presentation, and appropriate distribution of this information for improved decision making and tangible quality, compliance, and bottom-line impacts. Fortunately, there are a few of us who’ve witnessed and collaborated with institutions to help evolve from theory to strategy to solution.

Providers must formulate a strategy to capitalize on the mountains of data that will come once the healthcare industry figures out how to integrate technology across its outdated, paper-laden landscape. Producers and payers must implement the proper technology and processes to consume this data via enterprise performance management front ends so that the entire value chain becomes more seamless. The emphasis on data presentation (think BI, alerting, and predictive analytics) continues to dominate the headlines and budget requests. Healthcare institutions, though, understand that these kinds of advanced analytics require the appropriate clinical and technical expertise to implement. Organizations, now more than ever, are embarking on this journey. We’ve had the opportunity to help overcome the challenges of siloed systems, latent data, and an incomplete view of the patient experience, helping institutions realize the promise of an EMR, the benefits of integrated data sets, and the decision-making power of consolidated, timely reporting. None of these initiatives will be successful, though, with incomplete data sets; a successful enterprise data strategy, therefore, always embraces the principle of “No Data Left Behind.”

Open Source Development: Top 5 Things I’ve Always Wanted to Know

I am primarily familiar with Microsoft products and languages but am interested in learning more about open source. To summarize, here are the top five things I’ve always wanted to know about open source development:

1.  Who do you talk to when you have a support issue regarding an open source product that you’re using?

2.  Are open source programs as secure as proprietary programs, and are open source solutions embraced by large corporations even though they don’t have a name like Oracle or Microsoft behind them for support?

3.  What are the most popular open source programs out there?

4.  What’s the best place on the web to learn about the open source solutions out there today?

5.  Are there any known drawbacks to open source versus proprietary software that people should know about before pursuing it just on a cost basis?

Granted, I know partial answers to some of these, but as someone who doesn’t use open source much beyond Log4Net, these are fairly representative of the questions I’d have about open source technologies.