Easy Value with FDMEE Reports

Strolling into work sipping coffee, we soon realize that internal audit needs information out of Financial Data Quality Management Enterprise Edition (FDMEE).  After logging in to Data Management, what happens?  We freeze!  And the questions begin swirling in our heads:  How do we get data out of FDMEE?  What drivers are needed to do that?  What tools are needed to write an FDMEE report, and where do we get them?

At this point, it is often easier to start from the existing reports within the application, evaluate what they lack, and then modify and/or update them to meet our specific needs than to create a report from scratch.

A Variety of Report Options

FDMEE Reports does not equal Financial Reports.  From within the application, there are numerous report options to choose from, most of them updated versions of reports from FDM Classic.  The reports are organized into groups that categorize common reports together and provide information on the following:

  1. Audit Reports display all transactions for all locations that compose the balance of a target account
  2. Check Reports provide information on the issues encountered when data load rules are run
  3. Base Trial Balance Reports provide detail on how source data is processed
  4. Listing Reports summarize metadata and settings (such as the import format or check rule) for the current location
  5. Location Analysis Reports provide dimension mapping for the current location
  6. Process Monitor Reports show locations and their positions within the data conversion process
  7. Variance Reports display source and trial balance accounts for one target account, showing data over two periods or categories
  8. Intersection Reports identify invalid HFM data load intersections

Below is a screenshot of the default FDMEE report groups:

FDMEE Reports 1

Getting Started

While the canned reports are a great start, creating custom reports allows more creativity and only requires the following:

  1. Microsoft Word (2010+)
  2. Oracle BI Publisher 11.1.1.7 or 11.1.1.9
  3. Working knowledge of SQL
  4. Working knowledge of the FDMEE database tables

First, if you do not currently have Microsoft Word installed, this process isn’t going to work.  After confirming your version of Word, navigate to Oracle to download the BI Publisher software. (http://www.oracle.com/technetwork/middleware/bi-publisher/downloads/index-101746.html).

After you install the software, a BI Publisher toolbar becomes available in Word:

FDMEE Reports 2

This is where the good nerdy stuff happens!  You need to write a query, via SQL Developer or SSMS, that can then be dropped into FDMEE.  When you test/validate the query in FDMEE, it produces an XML file containing the first 100 rows of the result set.  That XML file is what you bring into BI Publisher (via Word) to build your report.  Below is a screenshot of the FDMEE-generated download for Word:

FDMEE Reports 3
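
For illustration, here is a minimal sketch of the kind of query that might be pasted into an FDMEE query definition.  The table and column names (TDATASEG, TPOVPARTITION, ACCOUNTX, AMOUNTX) are assumptions based on commonly used FDMEE staging tables, and the Oracle-style date literal is only an example; verify everything against your own environment before relying on it.

```sql
-- Hypothetical sketch: loaded amounts by location and target account for one period.
-- Table and column names are assumptions; confirm them against your FDMEE schema.
SELECT
    p.PARTNAME      AS LOCATION,
    d.ACCOUNTX      AS TARGET_ACCOUNT,
    d.ENTITYX       AS TARGET_ENTITY,
    SUM(d.AMOUNTX)  AS LOADED_AMOUNT
FROM TDATASEG d
JOIN TPOVPARTITION p
  ON p.PARTITIONKEY = d.PARTITIONKEY
WHERE d.PERIODKEY = TO_DATE('31-JAN-2016', 'DD-MON-YYYY')
GROUP BY p.PARTNAME, d.ACCOUNTX, d.ENTITYX
ORDER BY p.PARTNAME, d.ACCOUNTX
```

Validating a query like this in FDMEE is what produces the sample XML (the first 100 rows) that BI Publisher uses to build the report layout.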

And YES! FDMEE CAN Accept Inputs

FDMEE reports can accept multiple prompts.  A parameter value can be typed in by the user or selected from a drop-down, and that information can be gathered/compiled from multiple smaller report queries or from the out-of-the-box drop-downs.  Below is a sample FDMEE report with input parameters:

FDMEE Reports 4
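
As a hedged illustration of how a prompt might feed a query, here is a sketch of a parameterized report query.  The ~LOCATION~ and ~PERIOD~ tokens stand in for report parameters supplied at run time (typed in by the user or chosen from a drop-down); the exact placeholder syntax, table names, and column names are assumptions that should be confirmed for your FDMEE release.

```sql
-- Hypothetical sketch of a parameterized report query.
-- ~LOCATION~ and ~PERIOD~ represent run-time report parameters; the token
-- syntax and the TDATASEG columns shown are assumptions to verify.
SELECT
    d.ACCOUNT    AS SOURCE_ACCOUNT,
    d.ACCOUNTX   AS TARGET_ACCOUNT,
    d.AMOUNT     AS SOURCE_AMOUNT,
    d.AMOUNTX    AS TARGET_AMOUNT
FROM TDATASEG d
WHERE d.PARTITIONKEY = ~LOCATION~
  AND d.PERIODKEY    = ~PERIOD~
ORDER BY d.ACCOUNTX
```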

Ample Value

Custom FDMEE Reports can be valuable in many ways.  For example, reports can be written to:

  1. Provide Data Compare analysis for data validation activities
  2. Track how many times an end user has exported data for a specific period
  3. Download the maps for a location to Excel
  4. List all the Journals posted by period and category
  5. List all map modification activity by date range
  6. List all locations and categories and provide the status of each POV (see the sketch after this list)
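
As an example of the last item, here is a minimal sketch of a POV status query in the spirit of the process monitor report.  TLOGPROCESS, TPOVPARTITION, and TPOVCATEGORY are commonly cited FDMEE staging tables, but the column names for the workflow status flags are assumptions and should be checked against your schema.

```sql
-- Hypothetical sketch: one row per location/category/period with workflow
-- status flags. Table and column names are assumptions to verify.
SELECT
    p.PARTNAME        AS LOCATION,
    c.CATNAME         AS CATEGORY,
    l.PERIODKEY       AS PERIOD,
    l.PROCESSIMP      AS IMPORTED,
    l.PROCESSVAL      AS VALIDATED,
    l.PROCESSEXP      AS EXPORTED,
    l.PROCESSENTLOAD  AS LOADED
FROM TLOGPROCESS l
JOIN TPOVPARTITION p ON p.PARTITIONKEY = l.PARTITIONKEY
JOIN TPOVCATEGORY  c ON c.CATKEY       = l.CATKEY
ORDER BY p.PARTNAME, c.CATNAME, l.PERIODKEY
```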

Each of the report styles listed above has provided valuable information to auditors as well as to administrators of the FDMEE application.  One of the most valuable is the report that permits quick data validations and reconciliations, because it helps with COA conversions as well as upgrades to the EPM suite.  Here is a sample of a custom journal listing report:

FDMEE Reports 5

…and a custom FDMEE process monitor report:

FDMEE Reports 6

The Verdict

The possibilities for supplemental reporting with FDMEE are not limited to trial-balance analysis, trending, or variance reports.  Reports are often created to provide additional valuable information for auditors, data workflow analysis, or external and downstream systems.  In many cases, they are used to provide additional and supplemental detail to IT or financial auditors.  The verdict:  FDMEE Reports add easy value through variety and simplicity.

Contact us at info@ranzal.com with questions about this product.

When FDM Isn’t an Option…Using Essbase to Map Data

There are times when you do not have the option of using FDM to do large data mapping exercises prior to loading data into Essbase. There are many techniques for handling large amounts of data mappings in Essbase; I have used the technique outlined here several times for large mappings, and it continues to exceed my expectations from a performance and repeatability perspective.

Typically, without FDM or some other ETL process, we would simply use an Essbase load rule to do a “mapping” or a replace. However, there are those times when you need to do a mapping based on multiple members. For example, if account = x and cost center = y then change account to z.

Let’s start with the dimensionality in play, based on the example below: Time, Scenario, Type, NOU, Account, Total Hospital, and Charge Code.

Dimension        Type     Members in Dimension   Members Stored
Time             Dense    395                    380
Scenario         Dense    13                     6
Type             Sparse   4                      4
NOU              Sparse   25                     18
Account          Sparse   888                    454
Total Hospital   Sparse   5523                   2103
Charge Code      Sparse   39385                  39385

You then need to identify the logic of where the mapping takes place.  I want to keep the mapping data segregated from all other data, so I load it to a Mapping scenario (Act_Map).  I load a value of ‘1’ to the appropriate intersection, always at level 0.  Since the mapping applies to all Period detail, I load to a BegBalance member.  The client will then update this mapping file on a go-forward basis as new mapping combinations arise.

Here is a sample of the mapping file that gets loaded into Essbase:
NOU   STATUS   Revised DEPT   ACCT #   CDM       Data
SLJ   IP       2CC            2        0012013   1
SLJ   IP       2CC            2        0012021   1
SLJ   IP       2CC            2        0012062   1

Here is what it looks like when you do a retrieve.  For 4410CC->2600427->IP->67->SVM there is a value of 1, and for 4410CC->2600435->IP->67->SVM as well.

Essbase Mapping

The next step in the process is to load the actual data that ultimately needs to be mapped.  I load this data based on the detail and dimensionality I have, again at level 0.  In my experience, the data is missing a level of detail (GL account for project-based planning, Unit/Stat for charge master detail, etc.), so it gets loaded to a specific “No_Dimension” member or a generic member via a load rule.  Again, I load this data to a separate scenario (Act_Load).

In the example below you will see I am loading Account detail (67 & 68 in the above screenshot) to the Stat_Load member. The data comes across missing the account detail.

essbase mapping

The final step is to calculate the Actuals scenario based on the two scenarios above.  You will see that after we run the calculation, Current Yr Actuals is calculated correctly: the data resides where it should.

essbase mapping

Keeping all the data segregated in different scenarios allows you to easily clear data should anything be wrong with one of the loads, thereby keeping the other datasets intact. This process runs on the entire year in less than 2 minutes and not only performs the calculation but also does an aggregation for the Current Yr Actuals.

Come See Edgewater Ranzal at Kscope11

ODTUG Kscope11 is right around the corner.  Kscope11 offers a full-day EPM Symposium on Sunday, plus the opportunity to learn from experts in the EPM and BI fields on a wide range of topics.

Edgewater Ranzal will be well represented at the conference, with our associates delivering presentations covering Planning, DRM, EPMA, HFM, and FDM.  The sessions we will be presenting at Kscope11 are summarized below.  Each title links to an abstract for the presentation, providing additional details.

Session No.   Date      Time            Room   Presenter         Title
1             6/27/11   11:15 – 12:15   102C   Jeff Richardson   Calculation Manager: The New and Improved Application to Create Planning Business Rules
7             6/28/11   11:15 – 12:15   103C   Tony Scalese      Planning (or Essbase) and FDM and ERPi Equals Success!
10            6/28/11   4:30 – 5:30     101B   Chris Barbieri    Security and Auditing in HFM
11            6/29/11   8:30 – 9:30     103A   Patrick Lehner    Best Practices for Using DRM with EPMA
11            6/29/11   8:30 – 9:30     101B   Chris Barbieri    Getting Started with Calc Manager for HFM
12            6/29/11   9:45 – 10:45    101B   Chris Barbieri    Advanced Topics in Calc Manager for HFM
12            6/29/11   9:45 – 10:45    102C   John Martin       Have it Your Way: Building Planning Hierarchies with EPMA or Outline Load Utility
13            6/29/11   11:15 – 12:15   101B   Tony Scalese      Maximizing the Value of an EPM Investment with ERPi, FDM, & EPMA
17            6/30/11   8:30 – 9:30     101B   Tony Scalese      Taking Your FDM Application to the Next Level with Advanced Scripting
18            6/30/11   10:30 – 11:30   101B   Peter Fugere      IFRS Reporting Within Hyperion Financial Management

In addition to the presentations above, you can catch up with our experts at our booth in the Vendor Showcase.

We look forward to seeing you in Long Beach. If you haven’t already registered, you can do so here.

Can FDM Do That?

Most everyone who uses or has seen a demo of Oracle Hyperion Financial Data Quality Management (“FDM”) knows the basic functionality that FDM provides – mapping, data loading, data validation, and financial statement certification.  But FDM is so much more than the basics.

FDM is a very open architecture product readily supporting advanced customization.  As I highlighted in my last blog post, the workflow process (Import → Validate → Export → Check) can be fully automated and executed on a scheduled basis using either FDM’s internal scheduling component or any Windows batch based scheduling tool that an organization prefers.  But that’s just the tip of the proverbial iceberg.

Any organization that has recently experienced a Hyperion product upgrade – for example, System 9 to Fusion or Enterprise to HFM – knows the pain of revalidating years of financial data.  This exercise can easily take weeks.  Not only is this process time consuming, it’s tedious and often prone to error.  More importantly, data validation can be one of the biggest risks to a project.  The need to improve seems obvious. 

To address this opportunity, we developed a custom solution that leverages HFM and FDM’s advanced features, custom scripts and batch loading. The benefits are substantial – literally tens of thousands of data points can be validated (and re-validated) in minutes – with 100% data accuracy.  This process is easily extendable not only to other Oracle/Hyperion products like Planning & Essbase but potentially to other data stores.

The benefits of this process may be obvious but let’s take a moment to think about them:

  • 100% Data Accuracy – How valuable is this to your organization in the current economic and financial market climate?  The cost of restated financials is far too great to fathom – potential for government fines, reduced shareholder equity and even loss of one’s job.
  • Shorten Implementation Timelines – How nice would it be for your project to come in on time or early?  Using this solution, you can realistically trim weeks if not months out of a project timeline. 
  • Reduced Implementation Costs – Let’s face it, in this economy, every dollar needs to pay dividends.  Whether you choose to leverage a consultant, temp, intern, or internal resource to validate your data, there is a cost associated with it.  Reducing the time associated with this activity will reduce your project cost.

I invite you to check back often as I’ll continue to discuss “outside the box” solutions that can add significant ROI to your FDM investment.   

Contributed by:
Tony Scalese, FDM Design Lead
Hyperion Certified Consultant – HFM
Ranzal & Associates
ascalese@ranzal.com

Business Intelligence Technology Environment – Welcome to the Buffet

Business Intelligence Technology Environment or BITE is my own little tag line and acronym (maybe I should copyright it) to express the host of solutions available in the Business Intelligence application world today. (It could also be used as a verb to describe the plethora of poorly designed solutions… ahh but that is another story.)

My current blog series will be Oracle EPM/BI+ solution centric while remaining Oracle EPM/BI+ application agnostic (now dictionary.com is paying off). I hope that you will enjoy this real life approach to the process of decision making on software solutions interspersed with some genuine tips and tricks of the trade — some that you have seen before and some you have never imagined.

In other words, I hope that you will not find this blog to be represented by my newly coined acronym — BITE.

Rules of conduct while at the Buffet

First, we need a definition. Yes, a definition! Don’t be afraid; definitions are a good thing: they keep us grounded, they set limits, and finally they determine if we are true to our mission. I define BITE as the processes, software, and goals needed to precisely solution the business data critical to the legal, accounting, and business decision needs of a specific entity.

Inventive techno junkies, single tool consultants and one track sales people – CLOSE YOUR EYES / SHIELD YOUR COMPUTERS for this next statement else you might go blind. “Precisely Solution” in the definition of BITE includes the moral imperative of not misusing software for intent other than its design and picking software that fits the current business life cycle of a company. (Those of you with Software Misuse problems, I will be posting a number you can call to get help. Remember, the first step is admitting you have a problem.)

The application stack for EPM/BI+ includes HFM, Essbase (with all its add-on modules), Smart View, OBIEE, OBAW, FDM, DRM, ODI, and a few products you might not have heard about, or have heard about but never assessed for your purposes. NO, NO, No, no folks, this is not a software sales blog; it’s a solutions blog, and in our solutions toolbox we need to do more than use a single hammer creatively to remain competitive from an efficiency and business life cycle standpoint.

The Personalities in the Buffet Line

Now that we have some parameters (and I know it was painful for you left brainers) by which we can solution, we need some realistic company situations to solution. Let’s start with four companies, each different in their business life cycle, staff sizes, and demands for a BITE at success. You can email me if you will absolutely die without a very specific company example; however, I cannot boil the ocean here in this blog (small ponds are all that will be possible).

Our four companies need to be different to see the solutions at work. Let’s pick a manufacturer, a technology company, a retailer, and a commodity group. In my next installment we will outline the companies, their mission, their needs, and their resources.

The Workflow Revolution – Changes to Financial Reporting

Thomas Friedman first talked about how globalization impacts business life in The World is Flat.  In this book, he describes the ‘flattening of the world’ as the idea that workers from around the globe could collaborate and work across systems and wide spans of geography.  One specific part of this flattening is a change he refers to as the “quiet revolution in software, transmission protocols” that he calls “the ‘workflow revolution’ because of how it made everyone’s computer and software interoperable.”

I see this amazing transformation offered within financial software today, but many companies don’t completely understand the value of this approach or the concepts needed to implement it.

New financial systems today allow for the immediate submission of data.  The best practice applications of these systems allow for the validation, translation and commentary of this submission to be owned by the end users.

When I discuss the applied concept with clients, I speak of this ‘changing conversation.’  Before this workflow revolution, legal entities in remote parts of the globe would prepare financials and fax them, or teletype them, to a corporate office, a process that was manual, slow, and disconnected.

End users owning the process changes the communication of the business.  The typical conversation before might have been a submission of some financial data, followed by a response that the data is incorrect or incomplete, and then a resubmission, all taking days to complete.  The process was also flawed in that it relied completely on the receiving member being proactive and finding the errors.  Surprisingly, many companies still use this approach.

The technology exists to solve this problem and provide two major benefits.  First, products today make the validation systematic, hence reliable.  The end user knows immediately if the data is wrong and can resolve the issues.  These systems provide consistency and reliability that cannot be accomplished with people.  Second, the end users can be made aware of potential problems and begin researching proactively.  This proactive approach cuts days from the process and improves data quality.

Within my next blog posting, I will discuss many of the controls I am seeing in these systems like SAP’s BPC and Oracle’s HFM products, and how they improve data quality and speed of reporting.