Standardization of Comparative Analytics in Healthcare

A Comprehensive Solution for Value-Based Care

As healthcare providers rapidly consolidate and acquire smaller health systems, standardization becomes paramount: it enables comparative reporting across organizations and sites, which in turn facilitates changing attitudes, decreased costs, and better, more cost-effective care. Provider systems need to operate independently while using a standardized enterprise process to make effective decisions about costs, health outcomes, and patient satisfaction.  Without standardization, analyzing metrics can require considerable work and time, and comparisons of like sites become suspect, since nominally identical metrics can mean entirely different things at the underlying base-member calculation.

A standardized solution is simple – an enterprise-based model that allows data to be shared across systems and applications to facilitate comparative analytics with data integrity:

Image 1

Such a solution offers the ability to compare productivity indices across departments against national standards, using a standard calculation approach with federated master data across all toolsets. The result is comparative analytics that drive efficiency and value-based care:

Image 2
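To make the idea of a "standard calculation approach" concrete, consider a minimal sketch of a productivity index computed identically for every department and compared to a national benchmark. All department names, volumes, and benchmark values below are invented for illustration:

    # Hypothetical sketch: one standardized productivity-index formula
    # (worked hours per unit of service), applied identically at every site.
    worked_hours = {"Nursing": 12500.0, "Radiology": 4400.0}
    units_of_service = {"Nursing": 5000.0, "Radiology": 2000.0}   # e.g., patient days, exams
    national_benchmark = {"Nursing": 2.40, "Radiology": 2.10}     # hours per unit of service

    for dept, hours in worked_hours.items():
        index = hours / units_of_service[dept]            # same formula everywhere
        variance = index - national_benchmark[dept]
        print(f"{dept}: index={index:.2f}, variance to benchmark={variance:+.2f}")

Because every site uses the same base-member calculation, an index of 2.50 at one hospital means the same thing at another, and variances to the benchmark are directly comparable.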

A Comparison of Oracle Business Intelligence, Data Visualization, and Visual Analyzer

We recently authored The Role of Oracle Data Visualizer in the Modern Enterprise, in which we referred to both Data Visualization (DV) and Visual Analyzer (VA) as Data Visualizer.  This post addresses readers’ inquiries about the differences between DV and VA, as well as how both compare to Oracle Business Intelligence (OBI).  The following sections describe the OBI and DV/VA products and provide a matrix comparing each solution’s capabilities.  Finally, some use cases for DV/VA projects versus OBI are outlined.

For the purposes of this post, OBI will be considered the parent solution for the on-premises Oracle Business Intelligence solutions (including Enterprise Edition (OBIEE), Foundation Services (BIFS), and Standard Edition (OBSE)) as well as Business Intelligence Cloud Service (BICS). OBI is the platform thousands of Oracle customers rely on for robust visualizations and dashboard solutions from nearly any data source.  While the on-premises solutions are currently the most mature products, BICS is expected to become Oracle’s flagship product at some point in the future, at which time all features are expected to be available in the cloud.

Likewise, DV/VA will be used to refer collectively to Visual Analyzer packaged with BICS (VA BICS), Visual Analyzer packaged with OBI 12c (VA 12c), Data Visualization Desktop (DVD), and Data Visualization Cloud Service (DVCS). VA was initially introduced as part of the BICS package but has since become available as part of OBIEE 12c (the latest on-premises version).  DVD was released in early 2016 as a stand-alone product that can be downloaded and installed on a local machine.  Recently, DVCS was released as the cloud-based version of DVD.  All of these products offer data visualization capabilities similar to OBI’s but significantly enhance the way users interact with their data.  Compared to OBI, the interface is even more simplified and intuitive, which is an accomplishment for Oracle considering how easy OBI is to use.  Reusable, business-process-centric dashboards are available in DV/VA but are referred to as DV or VA Projects.  Perhaps the most powerful feature is the ability for users to mash up data from different sources (including Excel) to quickly gain insights they might otherwise have spent days or weeks manually assembling in Excel or Access.  These mashups can be turned into reusable DV/VA Projects that are refreshed through new data loads in the source system or by uploading updated Excel spreadsheets into DV/VA.

While the six products mentioned group nicely into these two categories, the following matrix outlines the differences among the individual products. The sections that follow provide commentary on some of the features.

Table 1:  Product Capability Matrix

Advanced Analytics provides integrated statistical capabilities based on the R programming language and includes the following functions:

  • Trendline – This function fits a linear or exponential line through noisy time-series data to indicate a general pattern or direction. For instance, while revenue fluctuates noisily over these three years, the Trendline plot reveals a slowly increasing general trend:
Figure 1:  Trendline Analysis
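Under the hood, a linear trendline is essentially a least-squares fit. The sketch below shows the equivalent computation in Python on synthetic monthly data; it illustrates the concept rather than the product’s internal R implementation:

    import numpy as np

    # Sketch of a linear trendline through noisy time-series data (synthetic values).
    months = np.arange(36)                                   # three years of monthly points
    rng = np.random.default_rng(0)
    revenue = 100 + 0.8 * months + rng.normal(0, 10, 36)     # slow upward drift plus noise

    slope, intercept = np.polyfit(months, revenue, 1)        # least-squares linear fit
    trendline = slope * months + intercept                   # the line to plot over the data
    print(f"fitted trend: {slope:.2f} per month")            # positive slope => rising trend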


  • Clusters – This function attempts to classify scattered data into related groups. Users can set the number of clusters and other grouping attributes. For instance, these clusters were generated using Revenue versus Billed Quantity by Month:
Figure 2:  Cluster Analysis
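Clustering like this is commonly done with an algorithm such as k-means. The sketch below groups synthetic two-measure data into three clusters; the product’s R-based routine may use a different algorithm or defaults:

    import numpy as np
    from sklearn.cluster import KMeans

    # Sketch of clustering two measures (synthetic Revenue vs. Billed Quantity).
    rng = np.random.default_rng(1)
    points = np.vstack([rng.normal(center, 5.0, (50, 2)) for center in (20, 50, 80)])

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
    print(kmeans.cluster_centers_)       # one (x, y) center per detected group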


  • Outliers – This function detects exceptions in the sample data. For instance, given the previous scatter plot, four outliers can be detected:
Figure 3:  Outlier Analysis
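One simple way to flag such exceptions is a z-score test: points that sit several standard deviations from the mean are treated as outliers. The sketch below plants four exceptions in synthetic data and recovers them; again, this illustrates the concept, not the product’s exact method:

    import numpy as np

    # Sketch of outlier detection via z-scores (synthetic data).
    rng = np.random.default_rng(2)
    values = rng.normal(100, 10, 200)
    values[:4] = [160, 155, 40, 35]                # plant four exceptions

    z = (values - values.mean()) / values.std()    # distance from mean in std deviations
    print(np.where(np.abs(z) > 3)[0])              # indices of the detected outliers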


  • Regression – This function is similar to the Trendline function, but it models the relationship between two measures and does not require a time series. It is often used to help create or evaluate forecasts. Using the previous Revenue versus Billed Quantity data, the following Regression series can be detected:
Figure 4:  Regression Analysis
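Conceptually this is an ordinary least-squares regression of one measure on another, which can then be used to extrapolate. A minimal sketch with synthetic data:

    import numpy as np

    # Sketch of regressing one measure on another (synthetic Billed Quantity vs. Revenue).
    rng = np.random.default_rng(3)
    billed_qty = rng.uniform(10, 100, 120)
    revenue = 3.5 * billed_qty + rng.normal(0, 8, 120)    # underlying linear relationship

    slope, intercept = np.polyfit(billed_qty, revenue, 1)
    forecast = slope * 150 + intercept                    # extrapolate to a new quantity
    print(f"revenue ~ {slope:.2f} * qty + {intercept:.2f}; at qty=150: {forecast:.0f}")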


Insights provide users with the ability to embed commentary within DV/VA projects (except in VA 12c). Users take a “snapshot” of their data at a certain intersection and attach an Insight comment.  These Insights can then be linked to one another to tell a story about the data and shared with others or assembled into a presentation.  For readers familiar with Hyperion Planning, Insights are analogous to Cell Comments.  OBI 12c (as well as 11g) offers the ability to write comments back to a relational table; however, this capability is not as flexible or robust as Insights and requires intervention by the BI support team to implement.

Figure 5:  Insights Assembled into a Story


Direct connections to a Relational Database Management System (RDBMS), such as an enterprise data warehouse, are now possible with some of the DV/VA products. (For the purposes of this post, inserting a semantic or logical layer between the database and the user is not considered a direct connection.)  The cloud-based versions (VA BICS and DVCS) can connect only to other cloud databases, while DVD allows users to connect to either an on-premises or a cloud database.  These connections will typically be created and configured by the IT support team or by analysts familiar with the data model of the target source and with SQL concepts such as joins between relational tables.  (Direct connections using OBI are technically possible; however, they require users to write the SQL for their analysis by hand.)  Once the connections are created and the correct joins are configured between tables, users can further augment the data with mashups.  VA 12c currently requires a Subject Area connected to an RDBMS to create projects.
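Conceptually, a direct connection plus a mashup amounts to joining warehouse tables and then enriching the result with user-supplied data. The sketch below shows that pattern in Python; the connection string, table names, and spreadsheet are all invented for illustration:

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical direct connection to a warehouse (all names are placeholders).
    engine = create_engine("oracle+oracledb://user:pwd@dwh-host:1521/?service_name=DWH")
    facts = pd.read_sql("SELECT dept_id, revenue FROM sales_fact", engine)
    depts = pd.read_sql("SELECT dept_id, dept_name FROM dept_dim", engine)

    joined = facts.merge(depts, on="dept_id")            # the configured join
    targets = pd.read_excel("dept_targets.xlsx")         # the user's mashup data
    mashup = joined.merge(targets, on="dept_name", how="left")

In DV/VA the join configuration and the Excel upload happen through the user interface rather than in code, but the resulting data model is equivalent.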

Leveraging OLAP data sources such as Essbase is currently possible only in OBI 12c (as well as 11g) and VA 12c. These data sources require that the OLAP cube be exposed as a Subject Area in the Presentation layer (in other words, there is no direct connection to OLAP data sources).  OBI is considered very mature here and offers robust mechanisms for interacting with the cube, including drillable hierarchical columns in Analyses.  VA 12c currently exposes a flattened list of hierarchical columns without a drillable hierarchical column.  As with direct connections, users can mash up their own data with the cubes to create custom data models.

While the capabilities of the DV/VA product set are impressive, it currently lacks some key capabilities of OBI Analyses and Dashboards. The most noticeable gaps are the inability to:

  • Create the functional equivalent of Action Links, which allow users to drill down or across from an Analysis
  • Schedule and/or deliver reports
  • Customize graphs, charts, and other data visualizations to the extent offered by OBI
  • Create Alerts which can perform conditionally-based actions such as pushing information to users
  • Use drillable hierarchical columns

At this time, OBI should continue to serve as the centerpiece for enterprise-wide analytical solutions that require complex dashboards and other advanced capabilities. DV/VA is better suited to analysts who need to unify discrete data sources in a repeatable, presentation-friendly format using DV/VA Projects.  As mentioned, DV/VA is even easier to use than OBI, which makes it ideal for users who want an analytics tool that lets them rapidly pull together ad hoc analyses.  As discussed in The Role of Oracle Data Visualizer in the Modern Enterprise, enterprises reaching for new game-changing analytic capabilities should give the DV/VA product set a thorough evaluation.  Oracle releases regular upgrades to the entire DV/VA product set, and we anticipate many of the noted gaps will be closed in the future.

Business Intelligence Technology Environment – Welcome to the Buffet

Business Intelligence Technology Environment or BITE is my own little tag line and acronym (maybe I should copyright it) to express the host of solutions available in the Business Intelligence application world today. (It could also be used as a verb to describe the plethora of poorly designed solutions… ahh but that is another story.)

My current blog series will be Oracle EPM/BI+ solution-centric while remaining Oracle EPM/BI+ application-agnostic (now dictionary.com is paying off). I hope that you will enjoy this real-life approach to making decisions about software solutions, interspersed with some genuine tips and tricks of the trade: some that you have seen before and some you have never imagined.

In other words, I hope that you will not find this blog to be represented by my newly coined acronym — BITE.

Rules of conduct while at the Buffet

First, we need a definition. Yes, a definition! Don’t be afraid; definitions are a good thing: they keep us grounded, they set limits, and they determine whether we are true to our mission. I define BITE as the processes, software, and goals needed to precisely solution the business data critical to the legal, accounting, and business decision needs of a specific entity.

Inventive techno junkies, single-tool consultants, and one-track sales people – CLOSE YOUR EYES / SHIELD YOUR COMPUTERS for this next statement, or you might go blind. “Precisely solution” in the definition of BITE includes the moral imperative of not misusing software for intents other than its design and of picking software that fits the current business life cycle of a company. (For those of you with Software Misuse problems, I will be posting a number you can call to get help. Remember, the first step is admitting you have a problem.)

The application stack for EPM/BI+: HFM, Essbase (with all its add-on modules), Smart View, OBIEE, OBAW, FDM, DRM, ODI, and a few products you might not have heard about, or have heard about but never assessed for your purposes. NO, NO, No, no, folks, this is not a software sales blog; it’s a solutions blog, and in our solutions toolbox we need to do more than creatively use a single hammer if we want to remain competitive from an efficiency and business life cycle standpoint.

The Personalities in the Buffet Line

Now that we have some parameters by which we can solution (and I know that was painful for you left-brainers), we need some realistic company situations to solution. Let’s start with four companies, each different in business life cycle, staff size, and demands for a BITE at success. You can email me if you will absolutely die without a very specific company example; however, I cannot boil the ocean in this blog (small ponds are all that will be possible).

Our four companies need to be different to see the solutions at work. Let’s pick a manufacturer, a technology company, a retailer, and a commodity group. In my next installment, we will outline the companies, their missions, their needs, and their resources.

Avoiding Pitfalls in Designing a Master Data Management (MDM) Architecture

Master Data Management (MDM), as a concept, has drawn a great deal of interest from departments heavily invested in several Business Intelligence (BI) and Enterprise Performance Management (EPM) applications.  MDM promises a utopian management center: a one-stop-shop solution for the master data and metadata of quickly changing BI and EPM systems.  Beyond an easier user interface for making numerous changes, MDM, as a purpose-built tool, offers data governance workflows, auditing, and archiving processes.  From an information technology (IT) perspective, the end state is that the MDM software acts as a central hub for all application administrators and designated business power users.  The tool integrates seamlessly into the current production process and feeds changes or rebuilt structures to each application in its native format (think dimension load rules in Essbase).

There are obvious challenges to be expected in an MDM initiative, as the goals are often lofty: agreeing on governance, production processes, user interaction, and workflow.  That does not even account for the change management challenge of working with multiple departments that almost always have different goals and uses for the tool.  Luckily, these items are generally a well-understood reality of the overall effort.  The one item that is thoroughly misunderstood, however, is how the MDM software architecture integrates into an existing production and business process.  Hint: the word “existing” is at the root of the misunderstanding.

The specific misconception I would like to tackle is the expectation that a single piece of MDM software both acts as the user interface and handles all integration with existing systems, while still fitting into the same business-side production process that existed before, minus a lot of the “boxes and arrows.”  I may be contradicting what many ‘sales’ types promise: that an MDM product easily integrates into disparate systems and simplifies the architecture.  What the ‘sales pitch’ does not clarify is that, to realize the advantages of the MDM product, a good MDM initiative also includes a re-engineering and tuning effort for the surrounding processes. Oops, they must have forgotten that part.

In several recent experiences, the biggest hurdle in gaining operational buy-in during the MDM initiative was the disillusionment that resulted from the recommendation to re-engineer existing integrations as well as add new ones.  One devil’s-advocate reaction summarized the sentiment perfectly:  “So let me get this straight, we are going to simplify and consolidate our production process by adding additional steps?”  Well, in a single-word response: “Yes!”

So how is this possible, and why is it necessary?  To clarify the struggle, the diagram below clearly demarcates the MDM tool from the processes that typically happen outside the tool.  The diagram typically has three states:

  • Current
  • Current with MDM
  • The eventual MDM goal

The specifics of the drawing change from implementation to implementation, but the basic result across the different states is an initial increase in the number of boxes and arrows, not a decrease.  There are two primary reasons why this is the case:

  1. The MDM initiative surfaces undocumented manual business logic and processes that are often not represented in current-state architectures.  After reviewing the often oversimplified current-state architecture a client provides, my two favorite questions for probing out these undocumented secrets are: “Ok, so is this really all there is to it?” and “Is this always how the production process works?  What happens when <fill in the blank> fails?”  The answers have to be key architectural considerations, as they are almost always the leading indicators of why the current state struggles.
  2. The scope and charter of the initial MDM initiative are championed by only one or two target systems, so the initiative has to minimize changes to upstream systems and processes.

Basic Master Data Management Conceptual Architecture


MDM Phase 1 implementations often strive to “sow the seeds” of consolidation but, because of project charter and scope, end up creating and adjusting current processes, resulting in more “pieces” in the architecture.  Such an intermediate step is necessary to show value immediately, get organizational buy-in, and keep the project length to “bites that can be chewed”.  There is nothing wrong with this approach, and this state is the reality for the vast majority of initial MDM initiatives.  In fact, several phases for different source/target systems may initially all start out like this!

In future phases, however, the MDM tool becomes a true hub of the existing systems, and master data integrations are specialized on a per-application basis.  Separate routines for managing master data (common or not) cease to exist in subscriber (source/target) systems.  The consolidation of business logic continues until it is completely removed from the integrations, which then serve only to communicate from system to system.  Maintenance and error-handling business processes and logic are likewise candidates to be consolidated out of the source and target systems.  It is at this point that the architecture morphs into what the initial MDM concept prescribed: a hub-and-spoke system.
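As a concrete (and deliberately simplified) illustration of that end state, the sketch below models a hub that owns the master data and pushes changes to subscribing systems, with the integrations acting as pure transport. All class and system names are hypothetical:

    # Hypothetical sketch of the hub-and-spoke end state: the MDM hub owns the
    # master data and all governance; spokes receive changes without transforming them.
    class Subscriber:
        def __init__(self, name):
            self.name = name

        def receive(self, dimension, member):
            # Pure transport: load the member as delivered, no local business logic.
            print(f"{self.name}: loading '{member}' into {dimension}")

    class MdmHub:
        def __init__(self):
            self.subscribers = []
            self.members = {}          # dimension -> list of governed members

        def register(self, subscriber):
            self.subscribers.append(subscriber)

        def add_member(self, dimension, member):
            # Validation, workflow, and auditing would all happen here, once.
            self.members.setdefault(dimension, []).append(member)
            for sub in self.subscribers:
                sub.receive(dimension, member)

    hub = MdmHub()
    hub.register(Subscriber("Essbase"))
    hub.register(Subscriber("HFM"))
    hub.add_member("Account", "41000 - Service Revenue")

The point of the sketch is the division of labor: every governed change is made once at the hub, and no subscriber maintains its own member-management routine.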

Basic Master Data Management Conceptual Architecture - Goal

Acknowledging and accounting for this incremental effort, especially the additional integrations, is a critical step in getting buy-in for the MDM initiative as a whole.  From an overall cost perspective, it is not uncommon for these integration steps to equal the workload of the core MDM tool development.  Even so, the immediate value proposition of the MDM tool should not be ignored.  In the long run, it is always cheaper to correct un-auditable, manual, error-prone processes so they cannot fail, or fail only in controlled scenarios with auditing and user warnings/guides, than to keep absorbing the hit of user frustration and IT all-nighters at the end of every reporting period.  Add to this the benefit of beginning to figure out, as a company, what departmental and application differences exist in a centralized setting, instead of conceding that Bob is going to have to stay another weekend to contrive a way to get all the applications in sync again (without, obviously, ever getting a chance to adjust the business process that created the issue).  Together these enable the slow (and often painful) breaking-down process needed to support an expandable and dynamic MDM solution.