Cloud Data Management (CDM) and Financial Data Quality Management Enterprise Edition (FDMEE): A Case Study in Working Together

Why buy Financial Data Quality Management Enterprise Edition (FDMEE) when Cloud Data Management (CDM) is free?  As outlined in my recent white paper – FDMEE vs. Cloud Data Management – there are myriad factors that can drive the decision.  This blog post highlights how one customer gained a highly flexible and automated solution for data and master data management with an on-premise deployment of FDMEE in conjunction with Cloud Data Management.

This customer adopted a pure Cloud strategy as it relates to Enterprise Performance Management (EPM), procuring subscriptions to Planning and Budgeting Cloud Service (PBCS), Financial Consolidation and Close Cloud Service (FCCS), and Account Reconciliation Cloud Service (ARCS).  A diverse business, the customer has many unique operational systems with varying formats and charts of accounts.  So far, no reason why Cloud Data Management (CDM) can’t handle this requirement, right?  This is what CDM does – uses import formats and maps to consume and transform data – right?  Sure, but with caveats.  Notice that I used the word consume and not extract.  CDM does not provide the ability to connect to on-premise systems and extract data.  Additionally, flat file data extracts that lack a consistent structure often cannot be natively consumed by CDM.

In this case, data needs to be loaded each day from numerous sources to support daily operational reporting.  The systems are a blend of on-premise, hosted, and Cloud applications.  The customer requirement dictated that any on-premise system be connected directly, eliminating the need to generate a flat file extract each day.  Additionally, the hosted and Cloud applications are very industry-specific and, in some cases, provided by niche vendors.  Modifying their extract formats was cost prohibitive or simply not supported.  As a result, several of these data feeds were not consumable by CDM without preprocessing or modification.

In light of the above requirements, the customer procured and deployed FDMEE on-premise.  The power of FDMEE allows a solution to be deployed that connects directly to multiple on-premise systems and consumes the flat file extracts from hosted and Cloud applications, including Excel files (not in the required FDMEE/CDM format) and XML.  Because FDMEE on-premise supports scripting, we were able to greatly enrich the data integration cycle with full end-to-end automation, including FTP download of hosted data, detection of data mapped to members not yet in PBCS or FCCS, dynamic setting of substitution variables based on the processing day, execution of calculations in PBCS, and email status alerts outlining the success or failure of each data load cycle.
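To make that automation more concrete, below is a minimal sketch of two of the building blocks described above: the FTP download of a hosted extract and an email status alert.  The host names, credentials, folder paths, and addresses are hypothetical, and the real solution runs inside FDMEE’s Jython event scripts (with the fdmAPI context) rather than as a standalone script.

```python
# Illustrative sketch only -- hypothetical hosts, credentials, and paths.
# The production solution runs as FDMEE (Jython) event scripts; this shows the
# general shape of the FTP download and email alert steps described above.
import ftplib
import smtplib
from email.mime.text import MIMEText

def download_hosted_extract(host, user, password, remote_file, local_file):
    """Pull the daily extract from the hosted application's FTP site."""
    ftp = ftplib.FTP(host)
    ftp.login(user, password)
    with open(local_file, "wb") as fh:
        ftp.retrbinary("RETR " + remote_file, fh.write)
    ftp.quit()

def send_status_alert(smtp_host, sender, recipients, subject, body):
    """Email the administrators the outcome of the data load cycle."""
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    smtp = smtplib.SMTP(smtp_host)
    smtp.sendmail(sender, recipients, msg.as_string())
    smtp.quit()

if __name__ == "__main__":
    download_hosted_extract("ftp.hostedapp.example", "svc_fdmee", "********",
                            "daily_extract.csv", "D:/FDMEE/inbox/daily_extract.csv")
    send_status_alert("smtp.example.com", "fdmee@example.com",
                      ["epm.admin@example.com"],
                      "Daily data load - extract downloaded",
                      "The hosted extract was downloaded and staged for FDMEE.")
```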

Although I am a huge FDMEE advocate, I recognize the value of Cloud Data Management and the benefits it provides in a case like this one.  This customer was one of just three participants in the Oracle Enterprise Data Management Cloud Service (EDMCS) program.  This means that they were able to use the software before it became generally available (GA).  To participate in this program, one must accept that certain features and functions are absent from the software.  In return, the program allows the customer (and partner) to offer Oracle development and product management valuable input about the software and, in some ways, drive which features are prioritized on the product roadmap.

EDMCS currently lacks native connections to FCCS, but this will change over time.  So how does CDM help with loading metadata to FCCS?  In a recent update to CDM, Oracle included the ability to import a flat file into CDM and load metadata to a registered target application such as PBCS or FCCS.  John Goodwin gives a detailed overview of the technical setup.

FDMEE and CDM have come together in this case to provide a fully automated data integration process and an automated master data integration process.  Within EDMCS, a Custom application type was created.  The required properties for FCCS were built and attached to the multiple dimensions being mastered, and flat file exports were generated for FCCS.  We knew we were going to use CDM to manage the master data load process, but we had a decision to make – do we leverage EPM Automate or FDMEE as our automation hub?

We chose FDMEE.  Why?  Simply because a lot of automation assets had already been developed in FDMEE that could readily be reused for this process, including execution of EPM Automate commands, a framework for leveraging the REST API (for PBCS and FCCS), and email alerting.  Additionally, we found the capabilities of EPM Automate to be somewhat limited.

For example, when you execute a CDM data load rule from EPM Automate, the process ID associated with the execution is not returned.  Why is that important?  Because in the event of a failure, I’d want to download the process log and attach it to the alert email so the user has information to address the issue.  Could I use the listfiles command of EPM Automate to find the process log? Possibly, but it doesn’t account for potential concurrency, and I am not doing my job as a consultant if I build a process that can’t handle concurrent operations.  For reasons such as these, we leveraged EPM Automate when possible and the REST API as needed, and we wrapped it all together with an FDMEE process that could be executed on a scheduled basis or on demand simply by using the Script Execution functionality.
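To illustrate the REST side of that decision, here is a minimal sketch of executing a CDM data load rule through the Data Management REST API so that the process (job) ID is captured and can later be used to retrieve the matching process log.  The pod URL, credentials, rule name, and periods are hypothetical, and the endpoint path, payload fields, and status values reflect my understanding of the Data Management REST API at the time; verify them against your pod’s API version.

```python
# Minimal sketch -- hypothetical host, credentials, rule name, and periods.
# Runs a CDM data load rule via the Data Management REST API and returns the
# job (process) ID so the matching process log can be retrieved on failure.
import time
import requests

BASE_URL = "https://planning-test-acme.pbcs.us2.oraclecloud.com"  # hypothetical pod
AUTH = ("acme.admin@example.com", "********")                      # hypothetical credentials

def run_data_rule(rule_name, start_period, end_period):
    """Execute a data load rule and return the job ID reported by CDM."""
    payload = {
        "jobType": "DATARULE",          # field names assumed from the Data Management REST API
        "jobName": rule_name,
        "startPeriod": start_period,
        "endPeriod": end_period,
        "importMode": "REPLACE",
        "exportMode": "STORE_DATA",
    }
    resp = requests.post(BASE_URL + "/aif/rest/V1/jobs", json=payload, auth=AUTH)
    resp.raise_for_status()
    return resp.json()["jobId"]

def wait_for_job(job_id, poll_seconds=15):
    """Poll the job until Data Management reports a terminal status."""
    while True:
        resp = requests.get("{0}/aif/rest/V1/jobs/{1}".format(BASE_URL, job_id), auth=AUTH)
        resp.raise_for_status()
        status = resp.json().get("status")
        # Assumed convention: -1 = still running; 0 = success; other values = failure/cancelled.
        if status is not None and status != -1:
            return resp.json()
        time.sleep(poll_seconds)

if __name__ == "__main__":
    job_id = run_data_rule("FCCS_ACCOUNT_METADATA", "Jan-24", "Jan-24")
    result = wait_for_job(job_id)
    print("Job", job_id, "finished with status", result.get("status"))
```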

Let’s review the end-to-end solution.  In EDMCS, metadata is maintained for PBCS and FCCS.  After maintenance is complete, the metadata is extracted to a flat file (.csv) and saved to a network folder.  From FDMEE, the master data integration process is initiated: the metadata files are uploaded to FCCS and PBCS, and Cloud Data Management data load rules are executed to process the metadata extracts.  In the event of an error, the CDM process log is downloaded.  Finally, an email is generated to alert the administrator of the status of the data integration process.
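For the file upload and rule execution steps, a thin wrapper around EPM Automate is the simplest option.  Below is a sketch of what the FDMEE process might invoke for one dimension; the utility path, pod URL, credentials, file name, rule name, and periods are all hypothetical, and the exact uploadfile and rundatarule arguments should be confirmed against the EPM Automate documentation for your release.

```python
# Orchestration sketch -- hypothetical paths, credentials, file and rule names.
# FDMEE invokes a wrapper like this to upload the EDMCS metadata extract to the
# cloud and trigger the CDM data load rule that processes it.
import subprocess

EPMAUTOMATE = r"C:\Oracle\EPMAutomate\bin\epmautomate.bat"   # hypothetical install path
URL = "https://planning-test-acme.pbcs.us2.oraclecloud.com"  # hypothetical pod
USER, PASSWORD = "acme.admin@example.com", "********"        # hypothetical credentials

def run(*args):
    """Run an EPM Automate command and fail loudly if it returns non-zero."""
    subprocess.check_call([EPMAUTOMATE] + list(args))

if __name__ == "__main__":
    run("login", USER, PASSWORD, URL)
    try:
        # Stage the EDMCS metadata extract in the Data Management inbox.
        run("uploadfile", r"\\network\share\edmcs\FCCS_Account.csv", "inbox")
        # Execute the CDM data load rule that imports the metadata file.
        run("rundatarule", "FCCS_ACCOUNT_METADATA", "Jan-24", "Jan-24",
            "REPLACE", "STORE_DATA", "inbox/FCCS_Account.csv")
    finally:
        run("logout")
```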

There you have it – EDMCS, FDMEE, and CDM working in concert to provide a seamless and elegant solution to data and master data integration for a customer that adopted a Cloud EPM strategy.  If you want to learn how you can enhance your Oracle EPM integration processes, contact us and we’ll be happy to discuss your options.

Avoiding Pitfalls in Designing a Master Data Management (MDM) Architecture

Master Data Management (MDM), as a concept, has drawn a great deal of interest from departments heavily invested in several Business Intelligence (BI) and Enterprise Performance Management (EPM) applications.  MDM promises a utopian management center, a one-stop-shop solution for the master data and metadata of quickly changing BI and EPM systems.  Beyond an easier user interface for making numerous changes, MDM, as a purpose-built tool, offers the ability to create data governance workflows, auditing, and archiving processes.  From an information technology (IT) perspective, the end state is that the MDM software acts as a central hub for all application administrators and designated business power users.  The tool seamlessly integrates into the current production process and feeds each application in its native format to apply changes or rebuild structures (think dimension load rules in Essbase).

There are obvious challenges that are expected to come up in an MDM initiative, as the goals are often lofty: agreeing on governance, production processes, user interaction, and workflow.  This doesn’t even take into account the change management challenge of working with multiple departments that almost always have different goals and uses for the tool.  Luckily, these items are generally a well-understood reality of the overall effort.  However, the one item that is thoroughly misunderstood is how the MDM software architecture integrates into an existing production and business process.  Hint: the word “existing” is at the root of this misunderstanding.

The specific misconception I would like to tackle is the expectation that one piece of MDM software acts as the user interface and also handles all integration with existing systems, while still fitting into the same business production process that existed before, minus a lot of the “boxes and arrows” in the existing architecture.  I may be contradicting what many ‘sales’ types promise: that an MDM product easily integrates with disparate systems and simplifies the architecture.  One thing the ‘sales pitch’ does not clarify is that, to realize the advantages of the MDM product, a good MDM initiative also includes a re-engineering and tuning effort for the surrounding processes. Oops, they must have forgotten that part.

In several recent experiences, the biggest hurdle in gaining operational buy-in during the MDM initiative centered on the disillusionment that resulted from the recommendation to re-engineer existing integrations as well as add new ones.  One devil’s advocate reaction summarized the sentiment of this disillusionment perfectly:  “So let me get this straight, we are going to simplify and consolidate our production process by adding additional steps?”  Well, in a single-word response, “Yes!”

So how is this possible, and why is it necessary?  To clarify this struggle, the diagram below clearly demarcates the MDM tool from the processes that typically happen outside of the tool.  The diagram typically has three states:

  • Current;
  • Current with MDM; and
  • The eventual MDM goal.

The specifics of the drawing change from implementation to implementation, but the basic result of the different states illustrates an initial increase in the number of boxes and arrows, not a decrease.  There are two primary reasons why this is the case:

  1. The MDM initiative actualizes undocumented manual business logic and processes that are often not represented in current state architectures.  After reviewing the often oversimplified current state architecture that a client provides me, my two favorite questions to begin probing for these undocumented secrets are: “Ok, so is this really all there is to it?” and “Is this always how the production process works?  What happens when <fill in the blank> event fails?”  The answers to these questions have to be key architectural considerations, as they almost always are the leading indicators of why the current state struggles.
  2. The scope and charter of the initial MDM initiative is championed by only one or two target systems and therefore the initiative has to minimize changes to upstream systems and processes.

Basic Master Data Management Conceptual Architecture

 

MDM Phase 1 implementations often strive to “sow the seeds” of consolidation but end up creating and adjusting current processes, resulting in more “pieces” in the architecture due to project charter and scope.  Such an intermediate step is necessary in order to show value immediately, get organizational buy-in, and keep the project length to “bites that can be chewed”.  There is nothing wrong with this approach, and this state is the reality for the vast majority of initial MDM initiatives.  In fact, several phases for different source/target systems may initially all start out like this!

In future phases, however, the MDM tool becomes a true hub of existing systems, and master data integrations are specialized on a per-application basis.  Separate management routines for master data (common or not) cease to exist in subscriber (source/target) systems.  The consolidation of business logic continues until all business logic is completely removed from integrations; the integrations serve only to communicate from system to system.  Additionally, maintenance and error handling business processes and logic are candidates to be consolidated and eliminated from the source and target systems.  It is at this point that the architecture morphs into what the initial MDM concept prescribed: a hub-and-spoke system.

Basic Master Data Management Conceptual Architecture - Goal

Acknowledging and accounting for this incremental effort, especially the additional integrations, is a critical step in getting buy-in for the MDM initiative as a whole.  From an overall cost perspective, it is not uncommon for these integration steps to equal the workload of the core MDM tool development.  Even so, the value proposition the MDM tool provides immediately should not be ignored.  In the long run, it is always cheaper to correct un-auditable, manual, and error-prone processes so they cannot fail, or fail only in controlled scenarios with auditing and user warnings/guides, than it is to incrementally take the hit of user frustration and IT all-nighters at the end of every reporting period.  Add to this the benefit of working out, as a company, what departmental and application differences exist in a centralized setting, instead of conceding that Bob is going to have to stay another weekend to contrive a way to get all applications in sync again (without, obviously, ever getting a chance to adjust the business process that created the issue).  Together, these benefits justify the slow (and often painful) process of breaking down existing processes in order to support an expandable and dynamic MDM solution.