Oracle Fusion Cloud EPM – 23.08 Update

EPM Cloud August Update

Test Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 4, 2023.


Production Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 18, 2023.


RELEASE HIGHLIGHTS


HELPFUL INFORMATION

The Oracle Help Center provides access to updated documentation. The updates will be available in the Help Center on Friday, August 4, 2023.

NOTE: Some of the links to new feature documentation included in this readiness document will not work until after the Oracle Help Center update is complete.

Updated documentation is published on the Oracle Help Center on the first Friday of each month, coinciding with the monthly updates to Test environments. Because there is a one-week lag between the publishing of the readiness documents (What’s New and New Feature Summary) and Oracle Help Center updates, some links included in the readiness documents will not work until the Oracle Help Center update is complete.

https://docs.oracle.com/en/cloud/saas/epm-cloud/index.html


FIXED ISSUES AND CONSIDERATIONS

Software issues addressed each month and considerations are posted to a knowledge article on My Oracle Support. Click here to review. You must have a My Oracle Support login to access the article.

NOTE: Fixed issues for EPM Cloud Common components (Smart View for Office, EPM Automate, REST API, Migration, Access Control, Data Management/Data Integration, Reports, Financial Reporting, and Calculation Manager) are available in a separate document on the My Oracle Support “Release Highlights” page.

The full Oracle advisory note can be found here.

Enterprise Data Management Series – Part 2

In the first part, we presented an overview of EDMCS. In this part, we will discuss what EDMCS has to offer and how. Unlike DRM, which is organized around versions, hierarchies, and nodes, EDMCS introduces views, viewpoints, and data chains. Let us go through the basic structure of EDMCS.

 

Figure 1 EDM Model


 

Enterprise data within each application is grouped into multiple dimensions, and each dimension has its own data chain. Registering a new application creates various objects and associated dimensions. An application consists of connected views, dimensions, and associated viewpoints:

  • A view is a collection of viewpoints.
  • Viewpoints are where users view and work with application data.
  • Each dimension contains a series of related data objects called a data chain, which consists of node types, hierarchy sets, node sets, and viewpoints.

 

These objects are the building blocks of EDMCS, as shown and explained below.
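To make the relationships concrete, here is a minimal sketch of how the data chain objects compose. This is our own illustrative Python, not Oracle's API; all class and instance names are invented for the example.

```python
from dataclasses import dataclass, field

# Illustrative model of the EDMCS data chain (names are ours, not Oracle's API).
@dataclass
class NodeType:                 # nodes sharing a business purpose, with properties
    name: str
    properties: list = field(default_factory=list)

@dataclass
class HierarchySet:             # parent-child relationships between nodes
    name: str
    parent_of: dict = field(default_factory=dict)   # child -> parent

@dataclass
class NodeSet:                  # the subset of a hierarchy (or list) exposed to users
    name: str
    hierarchy: HierarchySet
    top_nodes: list = field(default_factory=list)

@dataclass
class Viewpoint:                # where users view and edit nodes from one node set
    name: str
    node_set: NodeSet

@dataclass
class View:                     # a collection of viewpoints
    name: str
    viewpoints: list = field(default_factory=list)

# Build the chain bottom-up, mirroring the information model figure.
cost_center = NodeType("CostCenter", ["Name", "Description"])
hier = HierarchySet("CostCenterHier", {"CC100": "AllCostCenters"})
nodes = NodeSet("CostCenterNodes", hier, top_nodes=["AllCostCenters"])
vp = Viewpoint("Cost Centers", nodes)
view = View("Finance", [vp])
print(view.viewpoints[0].node_set.hierarchy.name)  # CostCenterHier
```

Walking the chain from a view down to a hierarchy set, as above, is exactly the "view → viewpoint → data chain" traversal the text describes.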

 


Figure 2 Information Model

 

Application

  • EDMCS models each connected system as an application. Click Register to create a new application.

 


Figure 3 Application

 

Dimension

  • Enterprise data is grouped as dimensions such as Account, Entity, and Movement.

Figure 4 Dimension

 

Figure 5 Data Chain Flow

 

Node Type

  • A node type is a collection of nodes that share a common business purpose, such as departments or entities.
  • It defines the properties for its associated nodes. For example, a Product node type can include properties such as Name, Description, and Cost.

Figure 6 Node Type

 

Hierarchy Set

  • A hierarchy set defines parent-child relationships for nodes; for example, employees roll up to departments, or vehicles roll up to automobiles.
  • You can define your own hierarchy sets using different relationships between node types.

Figure 7 Hierarchy Set

 

Node Set

  • A node set defines the group of nodes available in a viewpoint and consists of hierarchies or lists, for example a cost centre hierarchy or a list of country codes.
  • A node set exposes only the portion of a hierarchy set that is required in a viewpoint. Consider the figure below, where only the Marketing and Finance branches are included and the rest of the hierarchy is excluded.

Figure 8 Node Set

 

Viewpoint

  • Viewpoints are used for managing data: comparing, sharing/mapping, and maintaining a dimension across applications, such as viewing a list of accounts, managing a product hierarchy, or exporting an entity structure.
  • Viewpoints are organized into one or more views. Each viewpoint uses a node set and controls how users work with the data in that node set in a specific view.

Figure 9 Viewpoint

 

View

  • A view is a group of viewpoints, used, for example, to manage data for a dimension across applications or to integrate data with an external system.
  • Users can define additional views of their own to view and manage data for specific business purposes.

 

Figure 10 View Dashboard

 

 

Integration Benefits

Oracle has taken a major leap in improving integration in EDMCS. In DRM, integration with other Hyperion modules was possible only through tables, flat files, or APIs, often involving custom code development. EDMCS introduces application adapters (for example, for PBCS, FCCS, and EBS) that connect directly to the respective component. Note: adapters for some components have not yet been released by Oracle; however, you can always integrate using the standard flat-file export.

 

Migration made simple

An existing on-premises Data Relationship Management (DRM) application can be migrated to EDMCS. The administrator registers the DRM application in EDMCS as a custom application and then imports the dimensional structure. Note: DRM 11.1.2.4.330 or later is supported for on-premises-to-cloud migration.

 

Governance at a different level

Previously, on-premises DRM had a separate Data Relationship Governance (DRG) interface, but EDMCS includes governance as part of the application. In EDMCS, organizations use request workflows to exercise positive control over the processes and methods used by their data stewards and data custodians to create and maintain high-quality enterprise data assets. Workflow stages include Submit, Approve, and Commit. Before committing changes, users can visualize the changes and their effect on the hierarchy.
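The Submit/Approve/Commit stages can be pictured as a small state machine. The stage names come from the text above; the transition rules here are our own simplification, not the actual EDMCS workflow engine.

```python
# Minimal sketch of an EDMCS-style request workflow.
# Stage and action names follow the text; the rules are illustrative only.
ALLOWED = {
    "Draft":     ["Submit"],
    "Submitted": ["Approve", "Reject"],
    "Approved":  ["Commit"],
}
RESULT = {"Submit": "Submitted", "Approve": "Approved",
          "Reject": "Draft", "Commit": "Committed"}

def advance(stage: str, action: str) -> str:
    """Apply a workflow action, rejecting out-of-order transitions."""
    if action not in ALLOWED.get(stage, []):
        raise ValueError(f"{action!r} not allowed from stage {stage!r}")
    return RESULT[action]

stage = "Draft"
for action in ("Submit", "Approve", "Commit"):
    stage = advance(stage, action)
print(stage)  # Committed
```

The point of such a gate is exactly what the paragraph describes: no change reaches the committed state without passing through submission and approval first.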

 

Oracle Narrative Reporting – Part 2

Overview of the Report Package

  • EPRCS operates with the report package feature, which provides the ability to merge Microsoft Office data and documents. EPRCS can also be combined with on-premises software, cloud data sources, or other Oracle EPM applications.
  • Report packages provide a secure, collaborative, and process-driven approach for defining, authoring, reviewing, and publishing financial and management reports.
  • With report packages, one can organize the content, delegate roles to authors and reviewers, and manage the collaboration, workflow approvals, and sign-off process to create a structured document.

 

Figure 1 Report Package Features

 

How to create a Report Package

While creating a report package we need to provide the following details:

  • Enter Properties
    In the properties section, we need to provide the Name, Description, Report Type, Style Sample and Save To fields of a report package.

Figure 2 Enter Properties

 

Define Process

Apply the respective development phases and define the timeline for each phase.

 

  • Author Phase
    Once this phase is enabled, click on the Calendar icons to define the following dates: Start Author Phase On, Submit Doclets By and End Author Phase On.

 

Figure 3 Define Process: Author Phase

 

  • Review Phase
    Once the “Review Phase” has been enabled, click on the Calendar icons to define the dates: Start Review On, End Review Cycle 1 On and End Review On.

 

Figure 4 Define Process: Review Phase

 

  • Sign-Off Phase
    Once the “Sign-Off Phase” has been enabled, click on the Calendar icons to define the following dates: Start Sign Off On and End Sign Off On.

Figure 5 Define Process: Sign-Off Phase

 

  • Assign Users
    Next, we need to assign users and groups to the following report package responsibilities such as Owners, Reviewers, Signers, and Viewers.

 

Figure 6 Assign Users

 

  • Define Options
    The last step is to define the options for a report package such as Format Options, Shared Folder, and Doclet Versions.

 

Figure 7 Define Options

 

Finally, click “Finish” to complete the report package setup. Next, we will discuss the workflow process.

Collaboration and Workflow of Report Packages

The three phases that a report package includes are the Author, Review and Sign-Off phases. For a report package, one or more of the phases can be selected.

  • Author Phase
    Within this phase, content, comments, and supporting details are updated to help collaborate with other users. The phase can be applied to an entire report package, a section, or individual doclets.

 

Figure 8 Author Phase

 

  • Review Phase
    The review phase is a review cycle in which reviewers can view the current status of the doclets, add comments on the drafts through the commentary feature if needed, and eventually mark their review as complete.

Figure 9 Review Phase

 

  • Sign-Off Phase
    In the sign-off phase, anyone designated as a signer for the report package formally reviews the fully completed package one final time and either signs off on it or rejects it.

Figure 10 Sign-Off Phase

In the next blog, we will discuss the integration/extension of EPRCS with Microsoft Office products.

 

Oracle Data Integrator Cloud Service (ODICS) – PART 2

Oracle is one of the prominent leaders in comprehensive data integration solutions, with offerings that include Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle GoldenGate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics. ODICS provides continuous access to timely, reliable, and heterogeneous data from both on-premises and cloud solutions to support analytical and operational business needs.

 

New key investment areas ensure Oracle Data Integrator Cloud Services continues to support clients during their business growth and transformation process. ODICS introduces new functionality in the following areas:

 

Oracle Object Storage and Oracle Object Storage Classic

  • Oracle Object Storage and Object Storage Classic provide fast, stable, and secure cloud storage, and ODICS now integrates seamlessly with both on Oracle Cloud Infrastructure (OCI).
  • ODICS comes with a collection of Knowledge Modules (KMs) that can be used in Mappings and Packages to connect to Oracle Object Storage and Object Storage Classic and to manage files stored locally or in the Hadoop Distributed File System (HDFS).

 


Figure 1 ODI Object Storage

 

Autonomous Databases

  • ODICS now comes with optimized Loading and Integration Knowledge Modules (KMs) that are certified with Oracle Autonomous databases such as:
    • Oracle Autonomous Data Warehouse Cloud (ADW)
    • Oracle Autonomous Transaction Processing (ATP)
  • ODICS works easily with ADW and ATP, achieving better performance in a fully managed environment that is preconfigured for specific workloads.
  • Both ADW and ATP use the same set of Knowledge Modules and utilize the updated native integration of Oracle Object Storage and Oracle Object Storage Classic.
  • Additionally, Oracle Data Integrator users can also use native integration between Oracle Autonomous Data Warehouse and Oracle Object Storage to allow fast data transmission to ADW or ATP and simplify the entire loading process.

 


Figure 2 ODI Autonomous Data Warehouse

 

 

Oracle Enterprise Resource Planning (ERP) Cloud

  • The new release also provides a new Infrastructure and Software Platform for Oracle Enterprise Resource Planning (ERP) Cloud, a suite of cloud apps for accounting, project management, sourcing, risk management, and operations.
  • ODICS integrates seamlessly with the Oracle ERP Cloud platform, allowing companies to incorporate their ERP data into their data warehouses and data marts. The native integration also lets ODICS customers load data into Oracle ERP Cloud.

 


Figure 3 ODICS ERP

 

 

In the next post, we will discuss more key features such as Oracle Sales Cloud, Oracle Service Cloud, GIT Offline Support, and SAP Delta Extraction.

 

 

Oracle Data Integrator Cloud Service (ODICS) – PART 1

Oracle is one of the prominent leaders in comprehensive data integration solutions, with offerings that include Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle GoldenGate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics. ODICS provides continuous access to timely, reliable, and heterogeneous data from both on-premises and cloud solutions to support analytical and operational business needs.

ODICS Overview:

  • ODICS provides high-performance data transformation capabilities with its transparent E-LT architecture and extended support for cloud and big data applications.
  • ODICS supports all the features included in Oracle Data Integrator Enterprise Edition within its heterogeneous cloud service.
  • ODICS provides an easy-to-use interface to improve productivity, reduce development costs and decrease the total cost of ownership.
  • Oracle Data Integrator Cloud Platform is fully integrated with Oracle Platform as a Service (PaaS) offerings, such as Oracle Database Cloud Service, Oracle Database Exadata Cloud Service, and Oracle Big Data Cloud Service, to deliver on data needs.
  • ODICS can work with third-party systems as well as Oracle solutions as shown in the below screenshot.

ODI On-Premises Integration with Cloud Services


 

Cloud E-LT Architecture for High Performance vs Traditional ETL Approach:

  • Traditional ETL software is based on proprietary engines that execute row-by-row data transformations, thus limiting performance.
  • By implementing an E-LT architecture based on your existing RDBMS engines and SQL, data transformations can be executed on the target server.
  • The E-LT architecture gathers data from different sources, loads it into the target, and performs the transformations using the power of the database.
  • While utilizing existing data infrastructure, Oracle Data Integrator delivers flexibility by using the target server for data transformations, thereby minimizing network traffic.
  • The E-LT architecture thus delivers substantially higher performance than a traditional ETL engine.

ODICS ELT vs ETL Architecture Differences

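The E-LT pattern above can be shown in miniature: load raw rows into the target engine first, then let the target's own SQL do the transformation. Here SQLite stands in for the target RDBMS; the table and column names are invented for the example.

```python
import sqlite3

# E-LT in miniature: the transformation runs as SQL inside the target engine,
# not row by row in a separate ETL process.
src_rows = [("C100", 250.0), ("C100", 150.0), ("C200", 90.0)]  # extracted data

tgt = sqlite3.connect(":memory:")                               # target database
tgt.execute("CREATE TABLE stg_sales (cost_center TEXT, amount REAL)")
tgt.executemany("INSERT INTO stg_sales VALUES (?, ?)", src_rows)  # Load

tgt.execute("""CREATE TABLE f_sales AS                          -- Transform
               SELECT cost_center, SUM(amount) AS total
               FROM stg_sales
               GROUP BY cost_center""")

print(tgt.execute("SELECT * FROM f_sales ORDER BY cost_center").fetchall())
# [('C100', 400.0), ('C200', 90.0)]
```

Because the aggregation happens in the database, only the raw load crosses the network, which is the traffic-minimizing property the bullets above describe.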

 

Oracle Data Integrator Architecture Components:

The Oracle Data Integrator (ODI) architecture components include the below feature sets.

 

  • ODI SDK: A Java-based API for run-time and scheduling operations.
  • ODI Studio: The designer’s studio, used to manage connections, interface design, development, and automation, including scheduling.
  • ODI Standalone Agent: Can be configured in a standalone domain and managed by the WebLogic Management Framework.
  • ODI J2EE: The Java EE agent, based on the Java EE framework, that runs on a Managed Server configured in a WebLogic domain. This feature set comes only with the Enterprise installation.
  • ODI Standalone Agent Template: Domain files required when Oracle WebLogic Server is not managing your Oracle Data Integrator installation. This feature set is available only with the Standalone installation type.
  • ODI Console: A web-based console available to assigned users as an alternative to certain features of ODI Studio.
  • FMW Upgrade: The upgrade assistant used to upgrade Oracle Data Integrator from 11g to 12c.
  • Repository Creation Utility: The Repository Creation Utility (RCU) creates database schemas and is included with the Standalone installation type. The Enterprise installation does not include RCU; instead, RCU ships with the Oracle Fusion Middleware Infrastructure distribution.

 

ODICS Architecture


 

New / Enhanced Big Data and Cloud Features within ODICS:

ODICS continues to evolve with technological advancements in Big Data and Cloud Knowledge Modules for better transformations.

Big Data Features:

  • Spark Knowledge Module (KM) improvements: The emphasis was on producing high-performance, easy-to-read Spark code instead of handwritten scripts. Spark KMs now leverage the latest features, such as DataFrames from Apache Spark 2.x, to speed up ODI processes.
  • Spark KMs support in Knowledge Module Editor: The Spark KMs are now fully supported and can be customized as per specific needs.
  • Hadoop complex types enhancements: ODI enhances its support for Apache HDFS and the Kafka architecture.
  • Big Data Configuration Wizard: The Big Data Configuration Wizard is now updated with new templates for the current Cloudera distribution.

Spark KMs In Knowledge Module Editor


 

Cloud Features:

  • RESTful service support: ODICS can invoke RESTful services using Topology configurations that include RESTful service connectivity, resource URIs, methods, and parameters.
  • Business Intelligence Cloud Service (BICS) Knowledge Modules: BICS is now supported out of the box in ODICS.
  • Connectivity with Salesforce: ODICS is fully certified with Salesforce.com and now includes a JDBC driver for this technology out of the box.

ODI Integration With Salesforce

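The REST support described above boils down to capturing four things: base connectivity, a resource URI, a method, and parameters. A generic sketch of those pieces using only the Python standard library (the endpoint is a placeholder, not a real ODI or Oracle URL):

```python
from urllib.parse import urlencode, urljoin
from urllib.request import Request

# The four elements an ODICS REST topology captures, assembled by hand.
# The host, resource, and parameter names here are invented for illustration.
base_url = "https://api.example.com/"
resource = "v1/invoices"
params   = {"status": "open", "limit": "50"}

url = urljoin(base_url, resource) + "?" + urlencode(params)
req = Request(url, method="GET", headers={"Accept": "application/json"})
print(req.get_method(), req.full_url)
# GET https://api.example.com/v1/invoices?status=open&limit=50
```

In ODICS these elements live in the Topology configuration rather than in code, but the decomposition is the same.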

 

In the next part, we will focus on more key feature highlights within ODICS.

 

Introduction to Oracle Tax Reporting Cloud

 

 

Oracle Tax Reporting Cloud is a powerful application for increasing the efficiency of the tax function.

With the rise of digital economies, governments across the world are finding ways to tax the digital income generated in their countries. The OECD is working on the details of a digital tax so that companies and governments can work more efficiently. Meanwhile, France and the UK are ready to tax digital companies with digital taxes of their own.

In this world of uncertainty, governments are looking to increase their tax revenue and are changing tax laws accordingly. Companies are expected to calculate their tax obligations correctly, per the latest tax codes in each jurisdiction.

However, the tax functions in many companies still use Microsoft spreadsheets to prepare tax calculations. These calculations might be prepared at an entity level in a spreadsheet and then emailed to a regional or global tax function. Tax experts review the calculations at the group level, and if there are issues with an entity’s tax calculation or formula errors, the spreadsheet is sent back to the local team for correction. With so much to-and-fro, tax and finance teams can easily lose track of versions and corrections.

Sometimes tax calculation models refer to many linked spreadsheets, which creates further complexity when the spreadsheets need to be updated for a new account or for changes in legislation or accounting standards. Formula errors can lurk in the spreadsheets and are difficult to identify and correct. Another issue is that the tax models may be maintained and updated by a single team member; if that person leaves the company, the tax process and financial close are at significant risk. These are a few of the pain points, bottlenecks, and risks of using spreadsheets to prepare tax calculations.

Tax Reporting is a web-based cloud solution with built-in functionality for:

  • Configurable tax calculation rules
  • Automatic calculation of tax expense and DTA/DTL
  • An approval process
  • Roll-forward of tax accounts
  • Loading trial balance data
  • Loading fixed assets data
  • Currency translation
  • Consolidation
  • Effective tax rate calculation
  • Reporting on local/regional/state/national tax data
  • Producing tax accounting journal entries
  • Country-by-country reporting
  • Capturing supplemental data for tax calculations and additional disclosure
  • Maintenance by tax and finance users
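The effective tax rate mentioned in the list reduces, at its simplest, to a ratio the application computes automatically. A simplified sketch (real ETR reconciliations also involve permanent and temporary differences; the figures are invented):

```python
# Headline effective tax rate: total tax expense / pre-tax income.
# This is a deliberate simplification of what Tax Reporting calculates.
def effective_tax_rate(current_tax: float, deferred_tax: float,
                       pretax_income: float) -> float:
    if pretax_income == 0:
        raise ValueError("pre-tax income must be non-zero")
    return (current_tax + deferred_tax) / pretax_income

rate = effective_tax_rate(current_tax=210.0, deferred_tax=40.0,
                          pretax_income=1000.0)
print(f"{rate:.1%}")  # 25.0%
```

In a spreadsheet-driven process this formula would be re-keyed per entity; centralizing it is exactly the error-reduction argument made above.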

 

 

The tax reporting solution provides tax departments with the ability to meet global tax reporting requirements on an ongoing basis and ensure compliance with changing tax regulations.

We can help with implementing Oracle’s tax reporting solution for your organization and provide guidance on how to get the maximum value out of it.

Oracle Narrative Reporting – Part 1

In this blog post, we will focus on EPRCS security: the available access rights and roles.

EPRCS Overview

  • Oracle Enterprise Performance Reporting Cloud Service (EPRCS) is a Cloud solution for management and narrative reporting. It provides a secure and integrated solution that offers a collaborative and process-driven approach. Cloud maintenance, patching, and back-ups are managed by Oracle.
  • The workflow process provides collaboration, commentary, and delivery of management reporting through EPRCS objects which are stored in a Library. The library can be organized by folder and managed through security.
  • EPRCS allows users to easily combine both data and narrative content in report objects called doclets. Doclets are grouped together in a report package and support a check-in/check-out process to manage versions.
  • Extended Microsoft Office tools (Word, PowerPoint, etc.) can be used to provide the output of management reports. This includes an intelligent and intuitive simplified UI that can be accessed via desktop, mobile, and the web.
  • User Roles: Provides user types such as owners, authors, and approvers and role-based security and auditable access on desktop and mobile devices.

Figure 1 EPRCS Architecture

 

 

EPRCS Security

EPRCS enables secure collaboration between users. One can control which users can edit which Doclets. This allows users from various departments and areas to all contribute to the same report package thereby safeguarding sensitive data. Inside EPRCS, security is provided at three levels:

 

Figure 2 EPRCS Security Roles

System-Level Roles:

  • Two sets of roles are created for EPRCS environments: one for Production and one for Pre-Production. Pre-Production allows EPRCS customers to keep security separate for testing purposes.
  • Roles can also be combined into groups under My Services via the Custom Roles tab. It is considered best practice to assign security based on groups rather than to individual users. The five predefined roles are as follows:
  • Service Administrators: Create and maintain all aspects of the system, except for user management.
  • Reports Administrators: Create and manage report packages, management reporting definitions, and Disclosure Management documents.
  • Application Administrators: Create and maintain all application artifacts, such as applications, models, dimensions, and data grants.
  • Library Administrators: Create and manage folders, including root-level folders.
  • Users: The minimum role required to log in and participate in the reporting cycle, and to view artifacts to which the user has access.

 

 

Artifact-level Security:

Figure 3 Artifact level Security

 

 

When you create an artifact (report package, folder, or application), you automatically have permission to edit, delete, and maintain it. You can grant security on the artifacts you create to users and groups; users without access cannot see or access the artifact. Artifacts can be given the following forms of permissions:

 

Permission | Application | Report Packages | Third-party Artifacts or Folders
Administer | Y | Y | Y
Use | Y | – | –
View | – | Y | Y
Write | – | – | Y

 

  • Administer: Unrestricted view and change privileges on the artifact.
  • Write: Enables users to add content to a folder.
  • View: Enables users to view the artifact only.
  • Use: Enables the user to see the application in the library.

 

Figure 4 Access in Report Package

Figure 5 Access to Application


Data Security:

 

Figure 6 Data level Security

Data security determines the level at which data access permissions are granted to users. Dimension-based access is set by granting either READ access or NONE.

Figure 7 Dimension-based access

 

Data grants give access to parts of the data in a model. A grant may be at an individual dimension level or defined by cross-dimensional intersections.

Figure 8 Data Grant level access

To Summarize:

  • EPRCS is a powerful reporting solution that is secure, collaborative, and intuitive, meant to complement and combine reporting from various technologies.
  • Simplify the report creation and distribution process.
  • Collaborate with content contributors and reviewers.
  • Access through mobile or desktop: when you want, how you want.
  • Publish book-quality financial and management reports.
  • EPRCS has a wide variety of functions that allow it to meet unique reporting needs anywhere in an organization; with its cost-effective pricing, it is a great entry point for new cloud customers.

 

 

Part 2, “EPRCS Workflow Setup and Details”, will cover the workflow approvals and sign-off process related to publishing reports.

Enterprise Data Management Series – Part 1

Welcome to the first post in a series about the world of metadata management. Enterprise Data Management provides a new way to manage your data assets. You can manage application artefacts that include master data (members that represent data, including dimensions and hierarchies), reference data (such as page drop-downs for easier filtering in the front end), and mappings (master data member relationships). Using these pre-built functions, you will be able to track master data changes with ease.

If your business is restructured to align entities such as accounts, products, cost centers, and sales across multiple organizational units, you can create model scenarios, rationalize multiple systems, compare structures within a system, and more. You can maintain alternate hierarchies for reporting structures that differ from the current ERP system structure. When migrating an application to the cloud, you can define target structures and data mappings to accelerate the process. EDMCS also provides the ability to sync applications from on-premises to the cloud, or across cloud applications. The process involves the four C’s below:

  • Collate: The first process involves collecting and combining data sources through the process of application registration, data import, and automation.
    • The registration process refers to establishing application connections with a new data source(s).
    • The Import process refers to the loading of data into the application view(s).
    • Automation is built using the REST API features.

Figure 1 Collate Process: Register

 

Figure 2 Collate Process: Import

 

  • Curate: Curate helps organize the source data through a request mechanism used to load or change existing data. The request mechanism involves four steps: Record, Visualize, Validate, and Submit.
    • Record: All appropriate actions are recorded together in the Request pane, with the primary action listed first.
    • Visualize: This allows you to visualize all the changes prior to committing them: view the model changes for the request, study their impact, and make any final modifications. Changes made to a view are shown in distinct colours and icons so that you can see which parts of the hierarchy or list were changed and which areas may be affected by the change.
    • Validate: Maintain data integrity during data entry with real-time validation that checks for duplicates, shared nodes, and data type or lookup violations. You can also run validation during the Record step.
    • Submit: Once validations pass, you submit the request to commit the changes.

Figure 3 Request Process
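The Validate step above checks things like duplicate names and broken parent references before a request can be committed. A toy sketch of that kind of check (the rules and data are invented; EDMCS's actual validations are richer):

```python
from collections import Counter

# Illustrative request validation: flag duplicate node names and unknown
# parents among a request's proposed additions. Data is made up.
existing = {"AllDepts", "Sales"}
new_nodes = [("Marketing", "AllDepts"),   # (node, parent)
             ("Sales", "AllDepts"),       # duplicate of an existing node
             ("HR", "Ops")]               # parent does not exist anywhere

def validate(existing: set, additions: list) -> list:
    issues = []
    names = Counter(name for name, _ in additions)
    for name, parent in additions:
        if name in existing or names[name] > 1:
            issues.append(f"duplicate node: {name}")
        if parent not in existing and parent not in names:
            issues.append(f"unknown parent for {name}: {parent}")
    return issues

print(validate(existing, new_nodes))
# ['duplicate node: Sales', 'unknown parent for HR: Ops']
```

Running such checks at entry time, rather than after load, is what the text means by real-time validation.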

 

  • Conform: Conform is the process that helps standardize rules and policies through the Validate, Compare, and Map tasks.
    • Run validations between and across viewpoints, and share application data within and across applications to create consistency and build alignment.
    • Compare viewpoints side by side and synchronize them with simple drag and drop between and across applications.
    • Map nodes across viewpoints to build data maps, and construct alternate hierarchies across viewpoints using the copy dimension feature. What is a viewpoint? A viewpoint is the subset of nodes you work with.

 

Figure 4 Viewpoint and Various Actions

 

  • Consume: Consume defines how you move the changes to a target application, via download, export, and automation.
    • Download a viewpoint to make changes offline or to make bulk updates.
    • Export moves changes to other applications or syncs updated data to external target applications.
    • Using EPM Automate, you can load dimensions into, and extract them from, EPM Cloud applications.

 

Figure 5 EPMAutomate
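The EPM Automate step can be scripted from any language. Below is a hedged sketch that only builds the command lines rather than running them; the command names (`login`, `exportDimension`) are taken from EPM Automate usage we believe to be current, but verify them with `epmautomate help` in your environment, and the application, user, and file names are invented.

```python
import subprocess

# Hypothetical wrapper around the EPM Automate CLI. We only assemble the
# argument list here; uncomment the subprocess call to actually execute.
def epm(*args: str) -> list:
    cmd = ["epmautomate", *args]
    # subprocess.run(cmd, check=True)   # real invocation, disabled in this sketch
    return cmd

login  = epm("login", "svc_user", "password_file.epw",
             "https://example-edm.oraclecloud.com")            # placeholder URL
export = epm("exportDimension", "SampleApp", "Account", "account_export.csv")
print(" ".join(export))
# epmautomate exportDimension SampleApp Account account_export.csv
```

Wrapping the CLI this way lets a scheduler drive the Consume step (export, then hand the file to the target system) without manual clicks.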

 

 

Complete and Connected EPM cloud applications

 

Oracle has raised the bar with its latest complete and integrated cloud enterprise performance management applications. At the latest OpenWorld, Larry Ellison (founder of Oracle) restated Oracle’s mission:

“help people see data in new ways, discover insights, unlock endless possibilities”.

This shows the importance the visionary technology leader places on data and its usage. Oracle is changing from an on-premises software provider to a cloud-oriented company. It is the only company in the business software market with offerings in Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Oracle has rewritten the code of its cloud EPM applications to be optimised for the cloud. Controlling the infrastructure beneath the software gives it an advantage in fine-tuning the performance and security of its applications. Certainly, Oracle has a competitive advantage over its competitors.

 

Oracle EPM Cloud applications help the users in below-mentioned business processes:

  • Connected Planning: Planning application, Profitability and Cost Management application
  • Comprehensive Financial Close: Account Reconciliation, Financial Consolidations and Close, Tax Reporting
  • Reporting: Narrative reporting application
  • Data Management: Enterprise Data Management

 

 

Let us evaluate Oracle EPM applications against cloud partner selection criteria:

 

Cloud application availability and scalability

Since Oracle has a presence and customers across the globe, these applications are available in most countries. Scalability is built into the EPM applications and has been tested against extreme scenarios; Oracle has used a simulated SaaS environment to test the scalability of these applications.

Cloud partner completeness

Oracle has comprehensive cloud applications with breadth and depth across the EPM processes listed above. Additionally, these connected cloud applications are built on a common platform. This can really help the customer by coordinating with a single partner and getting the latest innovations across applications. For example, if users like the chatbot functionality in the Planning application, Oracle can quickly roll out the same functionality in Narrative Reporting or Enterprise Data Management.

The strategic focus of the cloud partner

As part of its strategy, Oracle has rewritten optimised code for its cloud applications, so customers get a best-in-class user experience and functionality. Oracle is continuously investing in research and development to provide the latest machine learning, AI, and analytics innovations in its cloud applications.

Cloud partner ecosystem

Being one of the oldest players in the business software market, Oracle has established an ecosystem of implementation and support consultants. These consultants used to help clients solve their problems with on-premises Oracle technology; now they are focusing on the cloud applications.

Customer focus

As part of continuous development, Oracle collaborates with its customers and on-the-ground consultants to add new features to these cloud applications.

 

So, Oracle provides complete and connected EPM cloud applications that can be configured to match a customer's business processes. Moving to cloud applications and choosing a cloud partner should be a strategic decision, and it is best to trust your business to a proven market leader. We can help you devise your cloud strategy and then implement it using best-in-class EPM cloud applications.

Little Known EPM tools: Using JHAT

The JHAT tool is another way to automate HFM tasks, using a batch file and the HFM API. It is the former HAT utility, updated to be compatible with the latest HFM release. JHAT lets you use any scheduler to launch HFM tasks and provides better flexibility than Task Flows.

The JHAT utility is located here (the batch file embeds all the libraries, paths, and other references needed to execute HFM tasks):

 Drive:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Server\jhat.bat
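As a quick orientation, the batch file takes the log file location and the input properties file as parameters (we pass the same two parameters from PowerShell later in this article). A minimal synopsis, with placeholder values:

```
jhat.bat -O<log file location> -I<properties file location>
```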

In this example, we are going to combine the power of PowerShell scripting with the functionality provided by JHat to perform an HFM metadata scan.

Running an HFM Metadata Scan using JHat

To begin, let's create an external variables file called PRD_External_Vars.ps1 and define all our required variables there.

You could also create a single file and declare the variables at the beginning, but I like to keep them separate: it is easier to manage and generally a cleaner approach.

 #HFM Variables
 $HFM_user_jh = '"hfmadmin"' #Username to log in
 $HFM_Password_jh = '"p@ssw0rd"' #Password for the user
 $HFM_Server_jh = '"HFMPrd"' #HFM cluster name
 $HFM_Application_jh = '"Demo"' #Application name
 $Delimiter = '";"' #App file delimiter
 $HFMScanMode = '"Replace"' #Load mode: 'Scan' or 'Replace'

Now that we have all our variables declared, let's get the JHat script going.

Let's begin by importing all the variables that we declared above into our JHat script:

#Create External Variable File Name (as per the environment)
$dir_ext_var = "mydrive:\mydir\PRD_External_Vars.ps1"

#Retrieve Variables from External Variable File
. $dir_ext_var

While we are at it, let's also declare a few additional variables for the log and properties files.

#Log file to log all the steps being executed by JHat
$HFMBatchLog = "mydrive:\\mydir\\HFMMetadata\\HFM_Metadata_Update_Load.log"

#Location for the properties file that will be created by PowerShell on the fly. This will be used by JHat
$OutPath="mydrive:\mydir\HFMMetadata"

#Location of the jhat file installed on the Financial Management server
$JHatLocation = "D:\Oracle\Middleware\EPMSystem11R1\products\FinancialManagement\Server"

#Location of the properties file. This would be passed as a parameter to the jhat batch
$InputFileLocation = "mydrive:\\mydir\\HFMMetadata\\hfm_md_load.properties"

#Temporary location of the jhat log file
$LogFileLocation_jh = "mydrive:\\mydir\\HFMMetadata\\hfmJH.log"

#Temporary location of the powershell log file
$LogFileLocation_ps = "mydrive:\\mydir\\HFMMetadata\\hfmPS.log"

#Log file which can be reviewed later after the job execution is completed.
$LogPath_jh="mydrive:\\mydir\\HFMMetadata\\PROD_HFMMDScanJH.log"

#Log file for PowerShell errors, if any occur during execution.
$LogPath_ps="mydrive:\\mydir\\HFMMetadata\\PROD_HFMMDScanPS.log"

#App file that will be used to perform the scan and/or load of HFM metadata (the file can be XML too)
$DimensionFile="mydrive:\\mydir\\HFMMetadata\\HFM_MetadataFile.app"

The next interesting bit is to create the HFM metadata load properties file on the fly. This file will be used by the JHat utility to perform the metadata scan.

What we are doing below is creating a properties file that JHat will use to:

  1. Login to the application,
  2. Open a session for the application,
  3. Perform a metadata scan,
  4. Store the output in a log file,
  5. Close the session and
  6. Logout of the application.

#Clear contents of the existing .properties file (ignore the error if the file does not exist yet)
Clear-Content $OutPath\hfm_md_load.properties -ErrorAction SilentlyContinue

#Available Functions
#Function Name: Logon - Login to the application
#Function Name: OpenApplication - Open a session to the specified application
#Function Name: LoadMetadata – Scan and/or load HFM metadata into the specified application
#Function Name: CloseApplication – Close the session opened
#Function Name: Logout – Log out of the application

#Create .properties file on the fly
Add-Content -Path $OutPath\hfm_md_load.properties -Value "Logon(""False"","""",$HFM_user_jh,$HFM_Password_jh);"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "OpenApplication($HFM_Server_jh,$HFM_Application_jh);"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "LoadMetadata($DimensionFile,$LogPath_jh,$Delimiter,$HFMScanMode,""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""True"",""False"",""False"",""False"",""True"");"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "CloseApplication();"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "Logout();"
Add-Content -Path $OutPath\hfm_md_load.properties -Value "End"
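For reference, given the variable values declared earlier, the generated hfm_md_load.properties should look roughly like the sketch below. Note that the path variables are expanded exactly as they were declared; the credential, cluster, delimiter, and mode variables carry their own embedded double quotes, so if JHat expects the path arguments quoted as well, embed quotes in those variables in the same way.

```
Logon("False","","hfmadmin","p@ssw0rd");
OpenApplication("HFMPrd","Demo");
LoadMetadata(mydrive:\\mydir\\HFMMetadata\\HFM_MetadataFile.app,mydrive:\\mydir\\HFMMetadata\\PROD_HFMMDScanJH.log,";","Replace","True","True","True","True","True","True","True","True","True","True","False","False","False","True");
CloseApplication();
Logout();
End
```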

#Call the JHat batch
#jhat.bat takes the log file location (-O) and the input properties file location (-I) as parameters
#-Wait blocks until JHat finishes, so the logs can be checked immediately afterwards
Start-Process -FilePath "$JHatLocation\jhat.bat" -ArgumentList "-O$LogFileLocation_jh -I$InputFileLocation" -Wait

Finally, let's run the PowerShell script.

Once the execution is complete, we can check Consolidation Administration and see that the metadata load started and completed without any errors.

JHat offers various other functions as well. Here is one more example.

Running an Intercompany Report using JHat


#Function Name: GenerateReport – To Generate ICP report
#Arg0 = Path (Path of the document in document manager)
#Arg1 = docName (Name of the document)
#Arg2 = reportType (valid options - intercompany, journal, EPU, ICTransactions, IC Match By Account, IC Match by ID)
#Arg3 = reportFormat (HFM_FORMAT)
#Arg4 = reportFile (location of the file where report must be stored)
#Arg5 = overriddenPOV (specify the POV to override it with)

GenerateReport("\\\","Monitoring_REP_Plug_Acct_Matching", "intercompany","HFM_FORMAT","D:\Oracle\Temp\Workspace\Intercompany\InterCompany.html","S#Scenario.Y#2019.P#Jun.W#YTD.V#<Entity Curr Total>.E#{Example.[Base]}");
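Following the same pattern as the metadata scan, the GenerateReport call would typically be wrapped in the same session-handling functions in the properties file passed to jhat.bat. A sketch, reusing the illustrative credentials and argument values from the examples above:

```
Logon("False","","hfmadmin","p@ssw0rd");
OpenApplication("HFMPrd","Demo");
GenerateReport("\\\","Monitoring_REP_Plug_Acct_Matching","intercompany","HFM_FORMAT","D:\Oracle\Temp\Workspace\Intercompany\InterCompany.html","S#Scenario.Y#2019.P#Jun.W#YTD.V#<Entity Curr Total>.E#{Example.[Base]}");
CloseApplication();
Logout();
End
```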