Standardization of Comparative Analytics in Healthcare

A Comprehensive Solution for Value-Based Care

As healthcare providers consolidate and acquire smaller health systems, standardization is paramount: it enables comparative reporting across organizations and sites, which in turn drives changed attitudes, decreased costs, and better, more cost-effective care. Provider systems need to operate independently while using a standardized enterprise process to make effective decisions about costs, health outcomes, and patient satisfaction. Without standardization, analyzing metrics can require considerable work and time, and comparing like sites becomes unreliable because nominally identical metrics can mean entirely different things at the underlying base-member calculation.

A standardized solution is conceptually simple: an enterprise-based model that allows data to be shared across systems and applications, facilitating comparative analytics with data integrity:

[Figure: enterprise-based model for shared data and comparative analytics]

Such a solution makes it possible to compare productivity indices across departments against national standards, using a standard calculation approach with federated master data across all toolsets. The result is comparative analytics that drive efficiencies and value-based care:

[Figure: productivity indices compared against national standards]
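To make "standard calculation approach" concrete, one common formulation of a labor productivity index is sketched below; the exact metric, the unit of service, and the benchmark source vary by department, so treat this as an illustration rather than a prescription:

$$\text{Productivity Index} = \frac{\text{Actual Worked Hours}}{\text{Units of Service} \times \text{National Standard Hours per Unit}}$$

An index of 1.0 means the department matches the national standard, while values above 1.0 indicate more labor per unit of service than the benchmark. The point of standardization is that every site computes the numerator and denominator from the same base-member definitions, so the indices are actually comparable.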

Oracle Business Intelligence Cloud Service (BICS) September Update

The latest upgrade to BICS happened last week and, while there are no new end-user features, it is now easier to integrate data. New in this version is the ability to connect to JDBC data sources through the Data Sync tool, which allows customers to set up automated data pulls from Salesforce, Redshift, and Hive, among others. In addition to these connections, Oracle RightNow CRM customers can pull directly from RightNow reports using Oracle Data Sync. Finally, connections between on-premises databases and BICS can be secured using Secure Sockets Layer (SSL) certificates.

After developing a custom script that used API calls to pull data from Salesforce, I am excited about the ability to connect directly to Salesforce with Data Sync. A direct connection to the Salesforce database allows you to search and browse the relevant tables and import their definitions with ease:

[Figure: browsing and importing Salesforce table definitions in Data Sync]

Once the definitions have been imported, standard query clauses make it possible to include only relevant data, perform incremental ETL, and further manipulate the data.
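As an illustration of the kind of filter clause involved, the sketch below builds an incremental SOQL query in Python; the object, the field list, and the watermark handling are assumptions for illustration, not Data Sync internals:

```python
from datetime import datetime, timezone

# Hypothetical high-water mark saved by the previous successful load.
last_run = datetime(2015, 9, 1, tzinfo=timezone.utc)

# SOQL expects ISO 8601 datetime literals (unquoted); SystemModstamp is
# a standard audit field on most Salesforce objects.
watermark = last_run.strftime("%Y-%m-%dT%H:%M:%SZ")

# Pull only rows changed since the last load (incremental ETL) and only
# the columns the target model actually needs.
soql = (
    "SELECT Id, Name, Amount, StageName, SystemModstamp "
    "FROM Opportunity "
    f"WHERE SystemModstamp > {watermark}"
)
print(soql)
```

A scheduled run would persist the latest SystemModstamp it saw and use it as the next watermark, which is all an incremental extract really needs.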

While there are no new features for end users, this is a powerful update for data integration. Using APIs to extract data from Salesforce meant that each extraction query had to be written by hand, which was time-consuming and error-prone. With these new data extraction processes, BICS implementations and data integration become much faster, furthering the promise of Oracle Cloud technologies.
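For contrast, here is a minimal sketch of the hand-written approach this update replaces, using the standard Salesforce REST query endpoint; the instance URL, access token, and API version are placeholders, and a real script would also have to implement the OAuth login flow, retries, and error reporting on its own:

```python
import requests

# Placeholder connection details, for illustration only.
INSTANCE = "https://example.my.salesforce.com"
TOKEN = "REPLACE_WITH_ACCESS_TOKEN"

def run_query(soql: str) -> list:
    """Execute one SOQL query via the REST API, following pagination."""
    url = f"{INSTANCE}/services/data/v34.0/query"
    headers = {"Authorization": f"Bearer {TOKEN}"}
    params = {"q": soql}
    records = []
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["records"])
        # nextRecordsUrl appears when the result set is paginated.
        next_path = payload.get("nextRecordsUrl")
        url = f"{INSTANCE}{next_path}" if next_path else None
        params = None  # the continuation URL already encodes the query
    return records

rows = run_query("SELECT Id, Name FROM Account")
```

Every new extraction meant another query and another script like this; Data Sync replaces that with a browsable connection and reusable load definitions.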

Avoiding Pitfalls in Designing a Master Data Management (MDM) Architecture

Master Data Management (MDM), as a concept, has drawn a great deal of interest from departments heavily invested in several Business Intelligence (BI) and Enterprise Performance Management (EPM) applications. MDM promises a utopian management center, a one-stop-shop solution for the master data and metadata of quickly changing BI and EPM systems. Beyond an easier user interface for making numerous changes, MDM, as a purpose-built tool, offers data governance workflows, auditing, and archiving processes. From an information technology (IT) perspective, the end state is that the MDM software acts as a central hub for all application administrators and designated business power users. The tool integrates seamlessly into the current production process and feeds each application in its native format to apply changes or rebuild structures (think dimension load rules in Essbase).

There are obvious challenges that can be expected in an MDM initiative, as the goals are often lofty: agreeing on governance, production processes, user interaction, and workflow. That does not even take into account the change management challenge of working with multiple departments that almost always have different goals and uses for the tool. Luckily, these items are generally a well-understood part of the overall effort. The one item that is thoroughly misunderstood, however, is how the MDM software architecture integrates into an existing production and business process. Hint: the word “existing” is at the root of this misunderstanding.

The specific misconception I would like to tackle is the expectation that one piece of MDM software both acts as the user interface and handles all integration with existing systems, while still fitting into the same business-side production process that existed before, minus a lot of the “boxes and arrows”. I may be contradicting what many sales types promise: that an MDM product easily integrates with disparate systems and simplifies the architecture. What the sales pitch does not clarify is that, to realize the advantages of the MDM product, a good MDM initiative also includes a re-engineering and tuning effort for the surrounding processes. Oops, they must have forgotten that part.

In several recent experiences, the biggest hurdle in gaining operational buy-in during the MDM initiative was the disillusionment that resulted from the recommendation to re-engineer existing integrations as well as add new ones. One devil's-advocate reaction summarized the sentiment perfectly: “So let me get this straight, we are going to simplify and consolidate our production process by adding additional steps?” Well, in a single-word response: “Yes!”

So how is this possible, and why is it necessary? To clarify the struggle, the diagram below clearly demarcates the MDM tool from the processes that typically happen outside of it. The diagram typically has three states:

  • Current;
  • Current with MDM; and
  • The eventual MDM goal.

The specifics of the drawing change from implementation to implementation, but the basic result across the states is an initial increase in the number of boxes and arrows, not a decrease. There are two primary reasons why this is the case:

  1. The MDM initiative actualizes undocumented manual business logic and processes that are often not represented in current-state architectures.  After reviewing the often oversimplified current-state architecture a client provides me, my two favorite questions for probing out these undocumented secrets are: “Ok, so is this really all there is to it?” and “Is this always how the production process works?  What happens when <fill in the blank> event fails?”  The answers to these questions have to be key architectural considerations, as they are almost always the leading indicators of why the current state struggles.
  2. The scope and charter of the initial MDM initiative is championed by only one or two target systems, so the initiative has to minimize changes to upstream systems and processes.

[Figure: Basic Master Data Management Conceptual Architecture]

MDM Phase 1 implementations often strive to “sow the seeds” of consolidation but, due to project charter and scope, end up creating and adjusting current processes, resulting in more “pieces” in the architecture. Such an intermediate step is necessary to show value immediately, get organizational buy-in, and keep the project length to “bites that can be chewed”. There is nothing wrong with this approach, and this state is the reality for the vast majority of initial MDM initiatives. In fact, several phases for different source/target systems may all start out like this!

In future phases, however, the MDM tool becomes a true hub of the existing systems, and master data integrations are specialized on a per-application basis. Separate management routines for master data (common or not) cease to exist in the subscriber (source/target) systems. The consolidation of business logic continues until all of it is removed from the integrations, which then serve only to communicate from system to system. Additionally, maintenance and error-handling business processes and logic are candidates to be consolidated and eliminated from the source and target systems. It is at this point that the architecture morphs into what the initial MDM concept prescribed: a hub-and-spoke system.
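As a minimal sketch of that end state (the names, formats, and subscribers below are invented for illustration): the hub owns the canonical master records, and each spoke is a thin adapter that only translates them into its application's native format, carrying no business logic of its own:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Member:
    """One canonical master-data record owned by the hub."""
    name: str
    parent: str
    alias: str

# Spoke adapters: pure translation into each system's native format,
# with no business logic of their own.
def to_essbase(m: Member) -> str:
    # e.g. one row for a dimension load rule: parent, child, alias
    return f"{m.parent},{m.name},{m.alias}"

def to_warehouse(m: Member) -> dict:
    return {"member_key": m.name, "parent_key": m.parent, "description": m.alias}

SUBSCRIBERS: Dict[str, Callable] = {
    "essbase": to_essbase,
    "warehouse": to_warehouse,
}

def publish(members: List[Member]) -> Dict[str, list]:
    """Hub-and-spoke publish: one canonical source, per-application formats."""
    return {app: [fmt(m) for m in members] for app, fmt in SUBSCRIBERS.items()}

masters = [Member(name="Cardiology", parent="Clinical", alias="Cardiology Dept")]
print(publish(masters))
```

The design point is that a change made once in the hub reaches every subscriber, and adding a new target system means adding one adapter, not another place where master data is maintained.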

[Figure: Basic Master Data Management Conceptual Architecture - Goal]

Acknowledging and accounting for this incremental effort, especially the additional integrations, is a critical step in getting buy-in for the MDM initiative as a whole. From an overall cost perspective, it is not uncommon for these integration steps to equal the workload of the core MDM tool development. Even so, the immediate value the MDM tool provides should not be ignored. In the long run, it is always cheaper to correct un-auditable, manual, error-prone processes so that they cannot fail, or fail only in controlled scenarios with auditing and user warnings and guides, than to take an incremental hit in user frustration and IT all-nighters at the end of every reporting period. Add to this the benefit of working out, in a centralized setting, what departmental and application differences actually exist, instead of conceding that Bob will have to stay another weekend to contrive a way to get all the applications in sync again (without, obviously, ever getting a chance to adjust the business process that created the issue). Together, these allow a slow (and often painful) breaking-down process in support of an expandable and dynamic MDM solution.