Managing your risk – Why you need a Financial Consolidation system

It is continually surprising to observe the number of ASX-listed organizations lacking robust systems to generate consolidated financial statements and notes to the accounts. Directors of companies are obligated to exercise due care and diligence under the Corporations Act 2001, which includes financial reporting responsibilities. Financial statements and notes to accounts are required to adhere to accounting standards and accurately reflect the company’s consolidated financial position and performance.

Many organizations depend on Excel as the primary tool for preparing statutory consolidated financial statements. Although Excel is a useful tool, it is often not the most suitable system for producing consolidated financial reports. Excel is not a robust option, and numerous well-documented cases in Australia and abroad have shown that errors in Excel have led to significant financial reporting mistakes.

Dedicated applications designed for preparing consolidated financial accounts come with standard features and functionalities that enable financial accounting teams to prepare the accounts confidently. Without such an application, Financial Controllers or CFOs may encounter several challenges when producing consolidated financial accounts in Excel, including:

Reporting Challenges of Excel

  • Multiple General Ledger Systems – It is very common for an organisation to have multiple General Ledger systems, each containing its own unique Chart of Accounts (CoA). These Trial Balances need to be extracted and mapped to a common CoA. These ledger systems also change constantly, for example through the addition of new accounts or legal entities. Financial consolidation applications have tools designed for accountants to manage this CoA data mapping process.
  • Currency Conversion – Many organisations have legal entities that report in other currencies and these need to be translated into AUD for consolidation purposes. Financial consolidation applications provide standard functionality to perform this currency conversion and report in multiple reporting currencies, for example AUD and USD.
  • Preparation of Notes to Accounts – The notes are used to make important disclosures that explain the numbers in a company’s financial statements. Common notes to the financial statements include accounting policies, depreciation of assets, inventory valuation, subsequent events, etc. These notes are typically complex to prepare and often require data from a variety of different data sources.
  • Intercompany Eliminations – Excel does not automatically post intercompany elimination and consolidation journals. In Excel, there is no standard ability to produce a mismatch report or maintain an audit trail of changes made to your data.
  • Data Validation – Most consolidation applications have multiple layers of data validation in order to minimise reconciliation issues. If Excel is used, it is difficult to achieve a similar level of the checks and balances required to have complete confidence in your financial reports.
  • Partial Ownership – Many companies have various subsidiaries/entities that are not fully consolidated into financial statements. Most consolidation applications have the ability to apply appropriate accounting equity rules to ensure that entities are correctly consolidated. Excel typically does not have this functionality.
  • Workflow – Financial consolidation reports are typically prepared by a team of financial accountants. Workflow is difficult to manage in Excel because files must be distributed to various stakeholders and last-minute adjustments and changes cannot be controlled. A dedicated consolidation application not only improves staff efficiency and workflow during the financial close, but can also ensure last-minute or post-close adjustments are not made without approval.
  • Security & Audit Features – When it comes to accountability, tracking user actions is crucial. This cannot be achieved in Excel.
  • Financial Statement Reporting – When preparing the financial statements, there are typically many different views of the data that need to be represented, for example by Product Group or Region. A dedicated consolidation application can make this data easy to present in a variety of reporting views.
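To illustrate the kind of CoA mapping and currency translation work described above that a consolidation system automates, here is a minimal Python sketch. All ledger names, account codes, and FX rates are hypothetical, and a real system would handle far more (eliminations, partial ownership, audit trails).

```python
# Minimal sketch: consolidate trial balances from two ledgers with
# different Charts of Accounts into a common CoA, translated to AUD.
# Ledger names, account codes, and FX rates are hypothetical.

# Map each ledger's local account codes to the group CoA
COA_MAP = {
    "ledger_au": {"4000": "Revenue", "6100": "Salaries"},
    "ledger_us": {"REV-01": "Revenue", "EXP-PAY": "Salaries"},
}

# Month-end FX rates to AUD (illustrative only)
FX_TO_AUD = {"AUD": 1.0, "USD": 1.5}

def consolidate(trial_balances):
    """trial_balances: list of (ledger, currency, {account: amount})."""
    group = {}
    for ledger, currency, tb in trial_balances:
        rate = FX_TO_AUD[currency]
        for account, amount in tb.items():
            try:
                common = COA_MAP[ledger][account]
            except KeyError:
                # A dedicated application surfaces unmapped accounts
                # instead of silently dropping them.
                raise ValueError(f"Unmapped account {account!r} in {ledger}")
            group[common] = group.get(common, 0.0) + amount * rate
    return group

tbs = [
    ("ledger_au", "AUD", {"4000": 1000.0, "6100": -400.0}),
    ("ledger_us", "USD", {"REV-01": 500.0, "EXP-PAY": -200.0}),
]
print(consolidate(tbs))  # Revenue: 1000 + 500 * 1.5 = 1750.0
```

The point of the sketch is the failure mode: in Excel an unmapped account simply vanishes from a SUMIF, whereas a mapping-driven process can refuse to produce a result until every account is mapped.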

Reporting Challenges of only using an ERP

Can you implement a financial consolidation system within an ERP application? Some ERP applications contain basic consolidation functionality; however, there are a number of typical shortcomings with this approach:

  • Mergers, Acquisitions & Disposals – Many large companies regularly undertake M&A activity. If you have just made a large acquisition, how will you produce your consolidated financial statements? Your ultimate goal might be to move the organisation to your core ERP, however these projects can typically take many months or years. Consolidation applications have tools that allow you to easily extract the Trial Balance from the General Ledger of a new organisation and map the CoA to your common consolidated financial reporting CoA. Compared to an ERP migration, this can be done in a very short time frame.
  • Notes to the Accounts – Some ERP applications provide functionality to post elimination journal entries, but they do not provide a facility to create notes to the accounts. Again, we too often see organisations attempting to perform this task within Excel, with its associated shortcomings.
  • Flexibility – Reporting requirements and accounting standards are continually changing and evolving. Typically, ERP applications are not very agile environments, and change is often difficult, time consuming, and costly to implement.
  • Costs – We are often surprised by the large investments that organisations make to enable financial consolidation within an ERP application. An application designed specifically for financial consolidations can be implemented for a fraction of the cost.

 

Specific financial consolidation applications are not costly to implement or maintain. Such applications can significantly lower audit fees for an organization. When standard end-of-year journal entries are utilized within these applications, they usually require only a single audit and approval by external auditors. In contrast, when spreadsheets are employed, they often necessitate a full audit annually.

As a Financial Controller, CFO, or Director of a major ASX listed company, I would avoid risking my personal or the organization’s reputation by relying on Excel for producing consolidated financial reports. The stakes are simply too great.


DAMIAN TIMMS

 

EPM Data Integration – Pipeline

 

Case Study

A client expressed the need for an interface that would allow non-technical professionals to run daily batches. These batches could include tasks like pulling Actuals from the source system, updating the Chart of Accounts (CoA), refreshing the Cost Centre structure, and running business rules, among others.

While seeking a solution, we explored numerous alternatives within Data Integration. However, several intricate steps were involved, requiring users to have a certain level of technical understanding of the Data Integration tool.

Solution

Exciting developments ensued when Oracle introduced a new feature known as “Pipeline.”

 


Pipeline in Data Integration

This innovative addition empowers users to seamlessly orchestrate a sequence of jobs as a unified process. Moreover, the Pipeline feature facilitates the orchestration of Oracle Enterprise Performance Management Cloud jobs across instances, all from a single centralized location.

By leveraging the power of the Pipeline, you can gain enhanced control and visibility throughout the entire data integration process, encompassing preprocessing, data loading, and post-processing tasks.

Yet, this merely scratches the surface. The Pipeline introduces a multitude of potent benefits and functionalities. We’re delving into an in-depth exploration of this novel feature to uncover its potential in revolutionizing your data integration process.


Note the following Pipeline considerations:

  • Only administrators can create and run a Pipeline definition.
  • Pipeline replaces the batch functionality in Data Management; existing batches can be migrated automatically to the Pipeline feature in Data Integration.
  • For file-based integrations to a remote server, when a file name is specified in the Pipeline job parameters, the system automatically copies the files from the local host to the remote server under the same directory.

This function applies to the following Oracle solutions:

  • Financial Consolidation and Close
  • Enterprise Profitability and Cost Management
  • Planning
  • Planning Modules
  • Tax Reporting


Proof of Concept

The EPM batches to run sequentially are:

Stage 1 – Load Metadata
  1. Load Account Dimension
  2. Load Entity Dimension
  3. Load Custom Dimension
  4. Clear current month Actuals (to remove any erroneous numbers)
Stage 2 – Load Data
  1. Load Trial balance from Source
Stage 3 – Run Business Rule
  1. Run business rule to perform aggregations and calculations.
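The staged sequence above can be sketched as a simple orchestration routine: stages run one after another, while the jobs inside a stage may run in parallel up to a cap (mirroring the Pipeline's maximum parallel jobs setting). The `run_job` function is a hypothetical stand-in, not an actual Data Integration call.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for submitting a Data Integration job.
def run_job(name):
    print(f"running {name}")
    return f"{name}: success"

# Stages mirror the proof-of-concept batch sequence above.
STAGES = [
    ("Load Metadata", ["Load Account Dimension", "Load Entity Dimension",
                       "Load Custom Dimension", "Clear Current Month Actuals"]),
    ("Load Data", ["Load Trial Balance"]),
    ("Run Business Rule", ["Aggregate & Calculate"]),
]

def run_pipeline(stages, max_parallel_jobs=2):
    """Stages run sequentially; jobs within a stage run in parallel,
    capped by max_parallel_jobs (like the Pipeline setting)."""
    results = []
    for stage_name, jobs in stages:
        with ThreadPoolExecutor(max_workers=max_parallel_jobs) as pool:
            # map() blocks until every job in the stage has finished,
            # so the next stage cannot start early.
            results.extend(pool.map(run_job, jobs))
    return results

statuses = run_pipeline(STAGES)
```

The design point is the boundary between stages: metadata must finish loading before data loads, and data must land before business rules run, which is exactly what the Pipeline's stage containers enforce.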


The workflow for creating and running a Pipeline process is as follows:

  1. Define the Pipeline

  1. Pipeline Name, Pipeline Code, and maximum Parallel Jobs
  2. A Variables page provides out-of-the-box global values for the Pipeline, from which you can set parameters at runtime. Variables can be pre-defined types such as “Period” and “Import Mode”.

 

  2. You can utilize Stages in the Pipeline editor to cluster similar or interdependent jobs from various applications together within a single unified interface. Administrators can efficiently establish a comprehensive end-to-end automation routine, ready to be executed on demand as part of the closing process.

Pipeline Stages & Container for multiple jobs as shown below:

Stages & Jobs example

 

New stages can be added simply by using the Plus card located at the end of the current card sequence.

 

  3. On the Run Pipeline page, complete the runtime variable prompts and then click Run, as shown below:

 

 

Variable Prompts

 

When the Pipeline is running, you can click the status icon to download the log. Customers can also see the status of the Pipeline in Process Details. Each individual job in the Pipeline is submitted separately and creates a separate job log in Process Details.

Users can also schedule the Pipeline with the help of Job Scheduler.

Variable Prompt

Review

 


Amir Kalawant

Oracle Fusion Cloud EPM – 23.08 Update

EPM Cloud August Update

Test Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 4, 2023.


Production Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, August 18, 2023.


RELEASE HIGHLIGHTS


HELPFUL INFORMATION

The Oracle Help Center provides access to updated documentation. The updates will be available in the Help Center on Friday, August 4, 2023.

NOTE: Some of the links to new feature documentation included in this readiness document will not work until after the Oracle Help Center update is complete.

Updated documentation is published on the Oracle Help Center on the first Friday of each month, coinciding with the monthly updates to Test environments. Because there is a one-week lag between the publishing of the readiness documents (What’s New and New Feature Summary) and Oracle Help Center updates, some links included in the readiness documents will not work until the Oracle Help Center update is complete.

https://docs.oracle.com/en/cloud/saas/epm-cloud/index.html


FIXED ISSUES AND CONSIDERATIONS

Software issues addressed each month and considerations are posted to a knowledge article on My Oracle Support. Click here to review. You must have a My Oracle Support login to access the article.

NOTE: Fixed issues for EPM Cloud Common components (Smart View for Office, EPM Automate, REST API, Migration, Access Control, Data Management/Data Integration, Reports, Financial Reporting, and Calculation Manager) are available in a separate document on the My Oracle Support “Release Highlights” page.

The full Oracle advisory note can be found here

Enterprise Data Management Series – Part 2

In the first part, we provided an overview of EDMCS. In this part, we will discuss more of what EDMCS has to offer and how. Unlike DRM, which has versions, hierarchies, and nodes, EDMCS introduces the View, Viewpoint, and Data Chain. Let us go through the basic structure of EDMCS.

 

Figure 1 EDM Model


 

Enterprise data within each application is grouped into multiple dimensions, and each dimension has its own data chain. Registering a new application results in the creation of various objects and associated dimensions. An Application consists of connected views, dimensions, and associated viewpoints:

  • The View is a collection of Viewpoints.
  • Viewpoints are where users view and work with application data.
  • Each dimension contains a series of related data objects called data chains, which consist of node types, hierarchy sets, node sets, and viewpoints.
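To make the data chain concrete, the sketch below models its four objects as simple Python structures. The Cost Centre dimension, node names, and relationships are purely illustrative, not EDMCS APIs.

```python
from dataclasses import dataclass

# Illustrative model of the EDMCS data chain: a node type holds nodes and
# their properties, a hierarchy set adds parent-child relationships, and
# a node set selects which nodes a viewpoint exposes.

@dataclass
class NodeType:
    name: str
    nodes: dict        # node name -> properties, e.g. {"Description": ...}

@dataclass
class HierarchySet:
    name: str
    parents: dict      # child node -> parent node

@dataclass
class NodeSet:
    name: str
    included: set      # subset of nodes exposed to the viewpoint

@dataclass
class Viewpoint:
    name: str
    node_set: NodeSet
    hierarchy: HierarchySet

    def children(self, parent):
        # Only nodes in the node set are visible in this viewpoint.
        return sorted(c for c, p in self.hierarchy.parents.items()
                      if p == parent and c in self.node_set.included)

# Hypothetical Cost Centre dimension
cc_nodes = NodeType("CostCentre", {"Marketing": {}, "Finance": {}, "IT": {}})
cc_hier = HierarchySet("CC Rollup",
                       {"Marketing": "Total", "Finance": "Total", "IT": "Total"})
cc_set = NodeSet("Reporting CCs", {"Marketing", "Finance"})  # IT excluded
vp = Viewpoint("CC Maintenance", cc_set, cc_hier)
print(vp.children("Total"))  # ['Finance', 'Marketing']
```

The layering is the point: the same node type and hierarchy set can back several node sets, so different viewpoints expose different slices of one dimension.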

 

The objects above are the building blocks of EDMCS, as shown and explained below.

 

Information Model

Figure 2 Information Model

 

Application

  • EDMCS models each connected system as an application. You can click Register to create a new application.

 


Figure 3 Application

 

Dimension

  • Enterprise data is grouped as dimensions such as Account, Entity, and Movement.

Figure 4 Dimension

 

Figure 5 Data Chain Flow

 

Node Type

  • A collection of nodes that share a common business purpose, like Departments or Entities.
  • Defines properties for associated nodes. For example, a Product node type can include properties like Name, Description, and Cost.

Figure 6 Node Type

 

Hierarchy Set

  • The hierarchy set defines parent-child relationships for nodes; for example, Employees roll up to Departments, or Vehicles roll up to Automobiles.
  • You can define your own hierarchy sets using different relationships between node types.

Figure 7 Hierarchy Set

 

Node Set

  • Defines a group of nodes available in Viewpoints and consists of hierarchies or lists; for example, a hierarchy of Cost Centres or a list of country codes.
  • A node set determines which portion of a hierarchy set is exposed in a Viewpoint. Consider the figure below, where only Marketing and Finance are included and the remainder of the hierarchy is excluded.

Figure 8 Node Set

 

Viewpoint

  • Viewpoints are used for managing data, such as comparing, sharing/mapping, and maintaining a dimension across applications, for example viewing a list of accounts, managing a product hierarchy, or exporting an entity structure.
  • Viewpoints are organized into one or more views. Each viewpoint uses a node set and controls how users work with data in that node set in a specific view.

Figure 9 Viewpoint

 

View

  • A view is a group of viewpoints used for purposes such as managing data for a dimension across applications or integrating data to and from an external system.
  • Users can define additional views of their own to view and manage data for specific business purposes.

 

Figure 10 View Dashboard

 

 

Integration Benefits

Oracle has taken a major leap in improving integration in EDMCS. In DRM, integration with other Hyperion modules was only possible through table, flat-file, or API integration, or through custom code development. EDMCS introduces adapters for various components, such as PBCS, FCCS, and EBS, to connect directly to the respective component. Note: adapters for some components are yet to be released by Oracle; however, you can always integrate using the standard flat-file export.

 

Migration made simple

Existing on-premise Data Relationship Management applications can be migrated to EDMCS. The administrator needs to register the DRM application in EDMCS as a custom application and then import the dimensional structure. Note: Data Relationship Management 11.1.2.4.330 or higher is supported for on-premise to cloud migration.

 

Governance at a different level

Previously, on-premise DRM had a separate Data Relationship Governance (DRG) interface, but EDMCS includes governance as part of the application. In EDMCS, organizations use request workflows to exercise positive control over the processes and methods used by their data stewards and data custodians to create and maintain high-quality enterprise data assets. Workflow stages include Submit, Approve, and Commit. Finally, before committing changes, users can visualize the changes and their effect on the hierarchy.
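The Submit, Approve, and Commit flow can be sketched as a small state machine. The stage names follow the text above; everything else (the Draft state, a Reject action, the example change) is an illustrative assumption, not the EDMCS implementation.

```python
# Illustrative request workflow with Submit, Approve, and Commit stages,
# mirroring the governance flow described above. Draft/Reject are assumed.

VALID_TRANSITIONS = {
    "Draft": {"Submit"},
    "Submitted": {"Approve", "Reject"},
    "Approved": {"Commit"},
}
NEXT_STATE = {"Submit": "Submitted", "Approve": "Approved",
              "Reject": "Draft", "Commit": "Committed"}

class Request:
    """A change request carrying proposed hierarchy changes."""
    def __init__(self, changes):
        self.changes = changes
        self.state = "Draft"

    def apply(self, action):
        # Positive control: only the allowed action for the current
        # stage can move the request forward.
        if action not in VALID_TRANSITIONS.get(self.state, set()):
            raise ValueError(f"Cannot {action} a request in {self.state}")
        self.state = NEXT_STATE[action]
        return self.state

req = Request(changes=["Move 'IT' under 'Shared Services'"])  # hypothetical
for action in ("Submit", "Approve", "Commit"):
    req.apply(action)
print(req.state)  # Committed
```

The value of the gate is that nothing reaches Committed without passing through Submitted and Approved, which is what distinguishes a governed workflow from ad-hoc edits.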

 

Oracle Narrative Reporting – Part 2

Overview of the Report Package

  • EPRCS operates with the “Report Package” feature, which provides the ability to merge Microsoft Office data and documents. EPRCS can also be combined with on-premise software, cloud data sources, or other Oracle EPM applications.
  • Report packages provide a secure, collaborative, and process-driven approach for defining, authoring, reviewing and publishing financial and management reports.
  • With report packages, one can organize the content, delegate roles to authors and reviewers, and manage the collaboration, workflow approvals, and sign-off process to create a structured document.

 

Figure 1 Report Package Features

 

How to create a Report Package

While creating a report package we need to provide the following details:

  • Enter Properties
    In the properties section, we need to provide the Name, Description, Report Type, Style Sample and Save To fields of a report package.

Figure 2 Enter Properties

 

Define Process

Apply the respective development phases and define the timeline for each phase.

 

  • Author Phase
    Once this phase is enabled, click on the Calendar icons to define the following dates: Start Author Phase On, Submit Doclets By and End Author Phase On.

 

Figure 3 Define Process: Author Phase

 

  • Review Phase
    Once the “Review Phase” has been enabled, click on the Calendar icons to define the dates: Start Review On, End Review Cycle 1 On and End Review On.

 

Figure 4 Define Process: Review Phase

 

  • Sign-Off Phase
    Once the “Sign-Off Phase” has been enabled, click on the Calendar icons to define the following dates: Start Sign Off On and End Sign Off On.

Figure 5 Define Process: Sign-Off Phase

 

  • Assign Users
    Next, we need to assign users and groups to the following report package responsibilities such as Owners, Reviewers, Signers, and Viewers.

 

Figure 6 Assign Users

 

  • Define Options
    The last step is to define the options for a report package such as Format Options, Shared Folder, and Doclet Versions.

 

Figure 7 Define Options

 

Finally, click “Finish” to complete the report package setup. Next, we will discuss the workflow process.

Collaboration and Workflow of Report Packages

The three phases that a report package includes are the Author, Review, and Sign-Off phases. One or more of these phases can be selected for a report package.
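As a sketch of how the phase dates defined earlier relate to one another, the check below validates that each phase ends after it starts and that phases follow the Author, Review, Sign-Off order. The dates are made up, and the ordering rule is an assumption for illustration; EPRCS enforces its own validation.

```python
from datetime import date

# Hedged sketch: sanity-check report package phase timelines.
# Dates are invented; the sequential-phase rule is an assumption.

def validate_phases(phases):
    """phases: list of (name, start_date, end_date) in process order."""
    prev_end = None
    for name, start, end in phases:
        if end < start:
            raise ValueError(f"{name} phase ends before it starts")
        if prev_end is not None and start < prev_end:
            raise ValueError(f"{name} phase starts before the previous phase ends")
        prev_end = end
    return True

phases = [
    ("Author",   date(2024, 1, 1),  date(2024, 1, 10)),
    ("Review",   date(2024, 1, 10), date(2024, 1, 20)),
    ("Sign-Off", date(2024, 1, 20), date(2024, 1, 25)),
]
print(validate_phases(phases))  # True
```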

  • Author Phase
    Within this phase, content, comments, and supporting details are updated to help users collaborate. It can be applied to an entire report package, a section, or individual doclets.

 

Figure 8 Author Phase

 

  • Review Phase
    The Review Phase is a review cycle where reviewers can view the current status of the Doclets, input comments on the drafts through the commentary feature if needed, and eventually mark their review as complete.

Figure 9 Review Phase

 

  • Sign-Off Phase
    In the sign-off phase, anyone designated as a signer for the report package formally reviews the fully completed report package one final time and either signs off on it or rejects it.

Figure 10 Sign-Off Phase

In the next blog, we will discuss integration/extension of EPRCS with Microsoft office products.

 

Oracle Data Integrator Cloud Service (ODICS) – PART 2

Oracle is one of the prominent leaders in providing comprehensive data integration solutions, which include Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle GoldenGate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics. ODICS provides continuous access to timely, reliable and heterogeneous data from both on-site and cloud solutions to support analytical and operational business needs.

 

New key investment areas ensure Oracle Data Integrator Cloud Services continues to support clients during their business growth and transformation process. ODICS introduces new functionality in the following areas:

 

Oracle Object Storage and Oracle Object Storage Classic

  • Oracle Object Storage and Object Storage Classic provide fast, stable, and secure cloud storage on Oracle Cloud Infrastructure (OCI), and ODICS now integrates seamlessly with them.
  • ODICS comes with a collection of Knowledge Modules (KMs) that can be used to link to Oracle Object Storage and Object Storage Classic in Mappings and Packages to manage files within the local archive or the Hadoop Distributed File System (HDFS).

 


Figure 1 ODI Object Storage

 

Autonomous Databases

  • ODICS now comes with optimized Loading and Integration Knowledge Modules (KMs) that are certified with Oracle Autonomous databases such as:
    • Oracle Autonomous Data Warehouse Cloud (ADW)
    • Oracle Autonomous Transaction Processing (ATP)
  • ODICS works easily with ADW and ATP, achieving better performance in a fully managed environment configured for specific workloads.
  • Both ADW and ATP use the same set of Knowledge Modules and utilize the updated native integration of Oracle Object Storage and Oracle Object Storage Classic.
  • Additionally, Oracle Data Integrator users can also use native integration between Oracle Autonomous Data Warehouse and Oracle Object Storage to allow fast data transmission to ADW or ATP and simplify the entire loading process.

 


Figure 2 ODI Autonomous Data Warehouse

 

 

Oracle Enterprise Resource Planning (ERP) Cloud

  • The new release also provides a new Infrastructure and Software Platform for Oracle Enterprise Resource Planning (ERP) Cloud, a suite of cloud apps for accounting, project management, sourcing, risk management, and operations.
  • ODICS works seamlessly with the Oracle Enterprise Resource Planning (ERP) Cloud platform, which allows companies to incorporate their ERP data into their data warehouses and data marts. The native integration also lets ODICS customers load data into Oracle ERP Cloud.

 


Figure 3 ODICS ERP

 

 

In the next post, we will discuss more key features such as Oracle Sales Cloud, Oracle Service Cloud, GIT Offline Support, and SAP Delta Extraction.

 

 

Oracle Data Integrator Cloud Service (ODICS) – PART 1

Oracle is one of the prominent leaders in providing comprehensive data integration solutions, which include Oracle Data Integrator Cloud Service (ODICS), Oracle Data Integration Platform Cloud, Oracle GoldenGate, Oracle Enterprise Data Quality, Oracle Enterprise Metadata Management, and Oracle Stream Analytics. ODICS provides continuous access to timely, reliable and heterogeneous data from both on-site and cloud solutions to support analytical and operational business needs.

ODICS Overview:

  • ODICS provides high-performance data transformation capabilities with its transparent E-LT architecture and extended support for cloud and big data applications.
  • ODICS supports all the features included in Oracle Data Integrator Enterprise Edition within its heterogeneous cloud service.
  • ODICS provides an easy-to-use interface to improve productivity, reduce development costs and decrease the total cost of ownership.
  • Oracle Data Integrator Cloud Platform is fully integrated with Oracle Platform as a Service (PaaS) offerings, such as Oracle Database Cloud Service, Oracle Database Exadata Cloud Service, and/or Oracle Big Data Cloud Service, to deliver on data needs.
  • ODICS can work with third-party systems as well as Oracle solutions as shown in the below screenshot.

ODI On-Premises Integration with Cloud Services


 

Cloud E-LT Architecture for High Performance vs Traditional ETL Approach:

  • Traditional ETL software is based on proprietary engines that execute row-by-row data transformations, thus limiting performance.
  • We can execute data transformations on the target server by implementing an E-LT architecture based on your existing RDBMS engines and SQL.
  • The E-LT architecture gathers data from different sources, loads into the target and performs transformations using the database power.
  • While utilizing existing data infrastructure, Oracle Data Integrator delivers flexibility by using the target server for data transformations, thereby minimizing network traffic.
  • The new E-LT architecture ensures the highest performance possible.
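The contrast above can be sketched with a toy example: a row-by-row transform executed in the client versus a single set-based SQL statement pushed down to the target database. Here sqlite3 stands in for the target RDBMS, and the table and transformation rule are made up; ODI generates far richer SQL via its Knowledge Modules.

```python
import sqlite3

# Toy contrast between row-by-row ETL and set-based E-LT.
# sqlite3 stands in for the target RDBMS; table and rule are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount_aud REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(1, 100.0), (2, 250.0), (3, 50.0)])

def etl_row_by_row():
    # ETL style: pull each row to the client, transform it there,
    # then push it back - one round trip per row.
    for rid, amount in conn.execute("SELECT id, amount FROM src"):
        conn.execute("INSERT INTO tgt VALUES (?, ?)", (rid, amount * 1.5))

def elt_set_based():
    # E-LT style: one set-based statement runs inside the database
    # engine, so the data never leaves the server.
    conn.execute("INSERT INTO tgt SELECT id, amount * 1.5 FROM src")

elt_set_based()
rows = conn.execute("SELECT id, amount_aud FROM tgt ORDER BY id").fetchall()
print(rows)  # [(1, 150.0), (2, 375.0), (3, 75.0)]
```

Both functions produce identical results; the difference is where the work happens, which is why the set-based form scales with the database engine rather than with client round trips.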

ODICS ELT vs ETL Architecture Differences


 

Oracle Data Integrator Architecture Components:

The Oracle Data Integrator (ODI) architecture components include the below feature sets.

 

  • ODI SDK – A Java-based API for run-time and scheduling operations.
  • ODI Studio – The designer’s studio used to manage connections, interface design, development, and automation, including scheduling.
  • ODI Standalone Agent – An agent that can be configured in a standalone domain and managed by the WebLogic Management Framework.
  • ODI J2EE – The Java EE agent, based on the Java EE framework, that runs on a Managed Server configured in a WebLogic domain. This feature set comes only with the Enterprise installation.
  • ODI Standalone Agent Template – Domain files that are required when Oracle WebLogic Server is not handling your Oracle Data Integrator installation. This feature set is available only with the Standalone installation type.
  • ODI Console – A web-based console available to assigned users as an alternative to certain features of ODI Studio.
  • FMW Upgrade – The upgrade assistant used to upgrade Oracle Data Integrator from version 11g to 12c.
  • Repository Creation Utility – The Repository Creation Utility (RCU) is used to create database schemas and is included with the Standalone installation type. The Enterprise installation does not include RCU; it is instead included with the Oracle Fusion Middleware Infrastructure distribution.

 

ODICS Architecture


 

New / Enhanced Big Data and Cloud Features within ODICS:

ODICS continues to evolve with technological advancements for Big Data and Cloud Knowledge Modules for better transformations.

Big Data Features:

  • Spark Knowledge Modules (KM) Improvement: The emphasis was on producing high-performance and easy-to-read Spark code instead of handwritten scripts. Spark KMs now leverage the latest features, such as DataFrames from Apache Spark 2.x, to speed up ODI processes.
  • Spark KMs support in Knowledge Module Editor: The Spark KMs are now fully supported and can be customized as per specific needs.
  • Hadoop Complex Types Enhancements: ODI enhances its support capability to Apache HDFS and Kafka Architecture.
  • Big Data Configuration Wizard: The Big Data Configuration Wizard is now updated with new templates for the current Cloudera distribution.

Spark KMs In Knowledge Module Editor


 

Cloud Features:

  • RESTful Service Support: ODICS can invoke RESTful Service in Topology configurations that include RESTful Service connectivity, resource URI, methods, and parameters.
  • Business Intelligence Cloud Service (BICS) Knowledge Modules: BICS is now supported out of the box in ODICS.
  • Connectivity with Salesforce: ODICS is fully certified with Salesforce.com and now includes a JDBC driver for this technology out of the box.

ODI Integration With Salesforce


 

In the next part, we will focus on more key feature highlights within ODICS.

 

Introduction to Oracle Tax Reporting Cloud

 

 

Oracle Tax Reporting Cloud is a powerful application for increasing the efficiency of the tax function.

With the rise of digital economies, governments across the world are finding ways to tax the digital income generated in their countries. The OECD is working on the details of a digital tax so that companies and governments can work more efficiently. Meanwhile, France and the UK are ready to tax digital companies with their own digital taxes.

In this world of uncertainty, governments are looking to increase their tax revenue and are changing tax laws accordingly. Companies are expected to calculate their tax obligations correctly under the latest tax codes in each jurisdiction.

However, tax functions in many companies are still using Microsoft spreadsheets to prepare tax calculations. These calculations might be prepared at an entity level in a spreadsheet and then sent to a regional or global tax function by email. Tax experts review tax calculations at the group level, and if there are issues with the tax calculation of an entity or formula errors, the spreadsheet is sent back to the local team for correction or updating. There is so much to-and-fro of spreadsheets that tax and finance teams can easily lose track of versions and corrections. Sometimes tax calculation models refer to many linked spreadsheets, which creates further complexity when the spreadsheets need to be updated for a new account or for changes in legislation or accounting standards. Formula errors can exist in the spreadsheets that are difficult to identify and correct. Another issue is that the tax models may be maintained and updated by a single team member; if that team member leaves the company, there is a big risk to the tax process and the financial close. These are a few of the pain points, bottlenecks, and risks of using spreadsheets to prepare tax calculations.

Tax reporting is a web-based cloud solution, which has inbuilt functionality for:

  • Configurable tax calculation rules
  • Automatic calculation of tax expense and DTA/DTL
  • Approval processes
  • Roll-forward of tax accounts
  • Loading of trial balance data
  • Loading of fixed asset data
  • Currency translation
  • Consolidation
  • Effective tax rate calculation
  • Reporting on local/regional/state/national tax data
  • Production of tax accounting journal entries
  • Country-by-country reporting
  • Capture of supplemental data for tax calculations and additional disclosure
  • Maintenance by tax and finance users
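One of the calculations in the list above, the effective tax rate, reduces to a simple ratio of total tax expense (current plus deferred) to pre-tax accounting income. The figures in this sketch are invented for illustration.

```python
# Effective tax rate = total tax expense / pre-tax accounting income.
# All figures below are invented for illustration.

def effective_tax_rate(current_tax, deferred_tax, pretax_income):
    if pretax_income == 0:
        raise ValueError("Pre-tax income is zero; ETR is undefined")
    return (current_tax + deferred_tax) / pretax_income

etr = effective_tax_rate(current_tax=250.0, deferred_tax=50.0,
                         pretax_income=1000.0)
print(f"{etr:.1%}")  # 30.0%
```

In practice the hard part is not the ratio but assembling reliable inputs per entity and jurisdiction, which is exactly what the application's data loading and consolidation features address.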

 

 

The tax reporting solution provides tax departments with the ability to meet global tax reporting requirements on an ongoing basis and ensure compliance with changing tax regulations.

We can help with implementing Oracle’s tax reporting solution for your organization and provide guidance on how to get the maximum value out of it.

Oracle Narrative Reporting – Part 1

In this blog post, we will be focusing on EPRCS security and the access rights and roles available.

EPRCS Overview

  • Oracle Enterprise Performance Reporting Cloud Service (EPRCS) is a Cloud solution for management and narrative reporting. It provides a secure and integrated solution that offers a collaborative and process-driven approach. Cloud maintenance, patching, and back-ups are managed by Oracle.
  • The workflow process provides collaboration, commentary, and delivery of management reporting through EPRCS objects which are stored in a Library. The library can be organized by folder and managed through security.
  • EPRCS allows users to easily combine data and narrative content in report objects called Doclets. Doclets are grouped together in a report package, and a check-in/check-out process on each Doclet manages versions.
  • Microsoft Office tools (Word, PowerPoint, etc.) can be used to produce the output of management reports. This is complemented by an intelligent, intuitive simplified UI that can be accessed via desktop, mobile, and the web.
  • User Roles: Provides user types such as owners, authors, and approvers, with role-based security and auditable access on desktop and mobile devices.

Figure 1 EPRCS Architecture


EPRCS Security

EPRCS enables secure collaboration between users: you can control which users can edit which Doclets. This allows users from various departments and areas to contribute to the same report package while safeguarding sensitive data. Inside EPRCS, security is provided at three levels:

Figure 2 EPRCS Security Roles

System-Level Roles:

  • For EPRCS environments, two sets of roles are created: one for Production and one for Pre-Production. Pre-Production allows EPRCS customers to keep security separate for testing purposes.
  • Roles can also be combined into groups under My Services via the Custom Roles tab. It is considered best practice to assign security to users based on groups rather than individually. The five predefined roles are as follows:
  • Service Administrators – Create and maintain all aspects of the system, except for user management
  • Reports Administrators – Create and manage report packages, management reporting definitions, and Disclosure Management documents
  • Application Administrators – Create and maintain all application artifacts, such as applications, models, dimensions, and data grants
  • Library Administrators – Create and manage folders, including root-level folders
  • Users – The minimum role required to log in and participate in the reporting cycle, and to view artifacts to which the user has access


Artifact-level Security:

Figure 3 Artifact level Security

 

 

When you create an artifact (report package, folder, or application), you automatically have permission to edit, delete, and maintain it. You can grant users and groups access to the artifacts you create; users without access cannot see or open the artifact. Artifacts can be given the following permissions:

Role         Application   Report Packages   Third-party Artifacts or Folders
Administer   Y             Y                 Y
Use          Y             –                 –
View         –             Y                 Y
Write        –             –                 Y

  • Administer: Unrestricted view and change privilege to all artifacts.
  • Write: Enables users to add folder content.
  • View: Enables users to view only the artifact.
  • Use: Enables the user to see the application in the library.
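The permission matrix above can be modelled as a simple lookup. This is an illustrative sketch only — the role and artifact names mirror the table, but the function is hypothetical, not an EPRCS API:

```python
# Permission matrix from the table above: which artifact types each
# role can access at all (names illustrative, not an EPRCS API).
PERMISSIONS = {
    "Administer": {"application", "report_package", "folder"},
    "Use":        {"application"},
    "View":       {"report_package", "folder"},
    "Write":      {"folder"},
}

def can_access(role: str, artifact_type: str) -> bool:
    """True if the given role grants any access to the artifact type."""
    return artifact_type in PERMISSIONS.get(role, set())

assert can_access("Administer", "report_package")
assert not can_access("Write", "report_package")  # Write applies to folders only
```

Encoding the matrix as data rather than branching logic keeps it easy to compare against the documented table when roles change.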

Figure 4 Access in Report Package

Figure 5 Access to Application


Data Security:

Figure 6 Data level Security

Data security determines which data a user can access. Data-level security is set through dimension-based access, by granting either READ or NONE access on dimension members.

Figure 7 Dimension-based access

Data grants give access to slices of data in a model. A grant may be made on an individual dimension or on cross-dimension intersections.
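How READ/NONE grants might combine at a data intersection can be sketched as follows. The names and the resolution rule shown (READ required on every dimension, defaulting to NONE when no grant exists) are illustrative assumptions, not the product's documented behaviour:

```python
# Hypothetical per-user grants: dimension -> member -> READ/NONE.
ACCESS = {
    "alice": {
        "Entity":  {"AU": "READ", "US": "NONE"},
        "Account": {"Revenue": "READ"},
    },
}

def can_read(user, intersection):
    """intersection: dict of dimension -> member, e.g. {"Entity": "AU"}.

    Assumed rule: the user needs READ on every dimension of the
    intersection; missing grants default to NONE.
    """
    grants = ACCESS.get(user, {})
    return all(
        grants.get(dim, {}).get(member, "NONE") == "READ"
        for dim, member in intersection.items()
    )

assert can_read("alice", {"Entity": "AU", "Account": "Revenue"})
assert not can_read("alice", {"Entity": "US", "Account": "Revenue"})
```

The "default to NONE" choice mirrors the least-privilege stance described above: a user sees nothing until a grant explicitly opens it.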

Figure 8 Data Grant level access

To Summarize:

  • EPRCS is a powerful reporting solution – secure, collaborative and intuitive – meant to complement and combine reporting from various types of technologies.
  • Simplify the report creation and distribution process.
  • Collaborate with content contributors and reviewers.
  • Access through mobile or desktop – when you want, how you want.
  • Publish book-quality financial and management reports.
  • With its wide variety of functions and cost-effective pricing, EPRCS can meet unique reporting needs anywhere in an organization and is a great entry point for new cloud customers.


Part 2, “EPRCS Workflow Setup and Details”, will cover the workflow approvals and sign-off process for publishing reports.

Enterprise Data Management Series – Part 1

Welcome to the first post in our series on the world of metadata management. Enterprise Data Management provides a new way to manage your data assets. You can manage application artefacts that include Master Data (members that represent data, including dimensions and hierarchies), Reference Data (such as page drop-downs for easier filtering in the front end), and Mappings (master data member relationships). Using these pre-built functions, you can track master data changes with ease.

If your business restructures and needs to align entities such as Accounts, Products, Cost Centres and Sales across multiple organizational units, you can create model scenarios, rationalize multiple systems, and compare structures within a system. You can maintain alternate hierarchies for reporting structures that differ from the current ERP system structure. When migrating an application to the cloud, you can define target structures and data mappings to accelerate the process. EDMCS also provides the ability to synchronize applications from on-premise to the cloud, or across cloud applications. The process involves the four C's below:

  • Collate: The first process involves collecting and combining data sources through application registration, data import, and automation.
    • Registration refers to establishing an application connection with a new data source.
    • Import refers to loading data into the application view(s).
    • Automation is built using the REST API.
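As a sketch of how that REST-based automation might be scripted, the snippet below builds (but does not send) a request to trigger an import. The endpoint path, payload fields and authentication scheme are assumptions for illustration; the real contract is documented in Oracle's EPM REST API reference:

```python
import json
from urllib.request import Request

def build_import_request(base_url, view, filename, token):
    """Build (but do not send) a POST request that triggers an import.

    The path and payload below are assumed for illustration only.
    """
    url = f"{base_url}/epm/rest/v1/views/{view}/imports"  # assumed path
    payload = {"fileName": filename}                      # assumed field
    return Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_import_request("https://example.oraclecloud.com",
                           "AccountView", "accounts.csv", "TOKEN")
print(req.full_url)
```

Separating request construction from sending keeps the automation testable without a live environment; the built `Request` can then be dispatched with `urllib.request.urlopen` inside a scheduled job.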

Figure 1 Collate Process: Register

Figure 2 Collate Process: Import

  • Curate: Curate organizes the source data through a Request mechanism used to load or change existing data. The request mechanism involves four steps: Record, Visualize, Validate and Submit.
    • Record: All related actions are recorded together in the Request pane, with the primary action listed first.
    • Visualize: This lets you visualize all the changes before committing them, so you can view the model changes in the request, study their impact and make any final modifications. Changes made to a view are shown in distinct colours and icons so that you can see which parts of the hierarchy or list were changed and which areas may be affected by the change.
    • Validate: Real-time validation maintains the integrity of data during data entry by checking for duplicates, shared nodes, and data type or lookup violations. You can also run validation during the Record step.
    • Submit: Once the changes pass validation, you submit the request to commit them.
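The kind of real-time check the Validate step performs can be sketched minimally as below. A real EDMCS validation covers far more (data types, lookups, node-type rules); this function and its names are my own illustration:

```python
def validate_request(existing_nodes, added_nodes):
    """Return a list of validation errors for an add-nodes request.

    Illustrative duplicate check only: flags any added node that already
    exists, or that appears twice within the same request.
    """
    errors = []
    seen = set(existing_nodes)
    for node in added_nodes:
        if node in seen:
            errors.append(f"duplicate node: {node}")
        seen.add(node)
    return errors

assert validate_request(["Sales", "COGS"], ["Opex"]) == []
assert validate_request(["Sales"], ["Sales"]) == ["duplicate node: Sales"]
```

Running such checks at data-entry time, rather than after submission, is what lets the tool reject a bad request before it ever touches the hierarchy.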

Figure 3 Request Process

  • Conform: Conform is the process to help standardize rules and policies through Validation, Compare and Map process tasks.
    • You can run validations between and across viewpoints. Share application data within and across the application to create consistency and build alignment.
    • Compare viewpoints side-by-side and synchronize using simple drag and drop between and across applications.
    • Map nodes across viewpoints to build data maps. Construct an alternate hierarchy across viewpoints using the copy dimension feature. So what is a viewpoint? A viewpoint is a subset of nodes for you to work with.

Figure 4 Viewpoint and Various Actions

  • Consume: Consume is the process of moving changes to a target application, which you can do by download, export, or automation.
    • Download a viewpoint to make changes offline or to make bulk updates.
    • Export moves changes to other applications or syncs updated data to external target applications.
    • Using EPM Automate, you can load dimensions into and extract dimensions from the EPM cloud application.
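A Consume automation can be scripted around the EPM Automate CLI, for example from Python. `login` and `logout` are real EPM Automate verbs; the dimension-export command shown is an assumption — check the EPM Automate command reference for your service's exact verbs. The commands are assembled here, not executed:

```python
def epmautomate(*args):
    """Return an EPM Automate command line as a list for subprocess.run."""
    return ["epmautomate", *args]

# Hypothetical session: credentials, URL and the exportDimension verb
# are placeholders for illustration only.
commands = [
    epmautomate("login", "user@example.com", "PASSWORD",
                "https://example.oraclecloud.com"),
    epmautomate("exportDimension", "AccountView", "Account.csv"),  # assumed verb
    epmautomate("logout"),
]

for cmd in commands:
    print(" ".join(cmd))
```

Building each command as an argument list (rather than one shell string) is the safe pattern for handing them to `subprocess.run` later, since it avoids shell-quoting issues with passwords and file names.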

Figure 5 EPM Automate