

    EPM Integration Agent Testing Impressions
    Topic posted September 18, 2019 by Tony Scalese, tagged Data load

    For those that have followed my blogs over the years, you likely know that whenever possible I like to weave a story into my posts. Let’s face it, integration can be a little dry so I like to bring in a bit of the human element. So let’s start this post with a bit of a back story. A few months ago, I blogged about OPA – the planned On-Premises Agent – which would allow Cloud Data Management to source data from on-premises systems (EPM and non-EPM alike) using just a lightweight Java-based utility that you would deploy within your company’s network.

    I thought OPA was a fun, catchy name. Now I’m not a marketing guru but wouldn’t Oracle want to associate their capability with an expression that means excitement? Short answer, no. “Stop calling it that Tony, we’re branding it EPM Integration Agent.” Ok, fine, but you missed a golden opportunity to call it Cloud Integration Agent or CIA. On second thought, you probably don’t want your Enterprise software associated with one of the most prolific spying agencies on the planet. Fine, we’ll go with EPM Integration Agent but I’m not calling it EIA because that sounds too much like EIEIO and I’m not (that) old and I certainly don’t have a farm. Ok, storytelling time is over, let’s get into the details.

    I have had the opportunity to beta test the EPM Integration Agent over the past couple of months. Let’s get straight to the punch line: it works, and it works well! I am incredibly excited about the possibilities that this opens up, not only in the very near future but also in the long term. With the agent, Oracle has laid the groundwork for a pure cloud environment, including data integration. Now let’s talk details.

    I got my first hands-on test with the agent soon after the 19.06 update dropped. I ran through several different scenarios and uncovered only a few bugs. I remember thinking how impressed I was that a beta product was so mature. Fast forward a couple of months. Oracle has updated some of the code and integrated agent functions into the front end of the EPM Cloud platform and asked for an additional round of testing. I wrapped up that testing last week and want to share my impressions and key functionality that is coming soon.

    What is the EPM Integration Agent?

    The EPM Integration Agent is a lightweight Java utility that is implemented within your company’s network. It enables communication between the Oracle EPM Cloud and your company’s data – notice I did not say systems or on-premises, more on this later. It is a component that can be used by Cloud Data Management and Data Integration (found under Data Exchange). At a high level, the agent accepts requests from the Oracle EPM Cloud, retrieves your data that lives somewhere other than the Oracle EPM Cloud, and sends that data to the cloud for processing. It’s incredibly simple but also very powerful.

    How Does The Cloud Talk to On-Premises?

    The agent has two methods that can be used to enable communication between your on-premises data and the Oracle EPM Cloud – SYNC and ASYNC. Technical folks hear these terms and generally think of synchronous (serial) and asynchronous (parallel) but in the context of the agent, they mean something entirely different. These methods refer to the communication mechanism that the agent utilizes.

    ASYNC Method

    When utilizing the ASYNC method, the agent polls the cloud on an interval (determined by you), looking for queued requests. Think of it like this: the agent picks up the telephone on a secured line and calls the cloud. “Hi EPM Cloud, this is the EPM Integration Agent, do you need me to do any work right now?” “No.” Ok, and the agent hangs up. The process repeats every X minutes until the agent gets a job or jobs from the cloud.

    In ASYNC mode, when a Cloud Data Management job that utilizes the agent is executed, the job goes into a queue and waits for the agent to poll the cloud and look for jobs. If you are running the job interactively, the import step will appear to be processing until the agent picks up the request and returns data to the cloud. If you are going to utilize the ASYNC method, consider this when setting the polling time parameters.
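    The ASYNC pattern is easy to picture in code. Here is a toy sketch (plain Python with a stand-in class; the agent's actual internals and endpoints are not public, so every name here is illustrative):

```python
class FakeCloud:
    """Stand-in for the Oracle EPM Cloud job queue (illustration only)."""
    def __init__(self, queued_jobs):
        self.queued_jobs = list(queued_jobs)

    def get_queued_jobs(self):
        # The real agent makes an outbound HTTPS call on port 443 here.
        jobs, self.queued_jobs = self.queued_jobs, []
        return jobs


def poll_once(cloud):
    """One pass of the ASYNC loop: ask the cloud for work, run anything queued."""
    return ["extracted data for " + job for job in cloud.get_queued_jobs()]


cloud = FakeCloud(["rule-101"])
print(poll_once(cloud))  # first poll picks up the queued job
print(poll_once(cloud))  # nothing queued until the next job is submitted
```

    In the real agent, this loop repeats on the interval you configure, which is exactly why the polling time matters for interactive runs.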

    SYNC Method

    The SYNC method handles requests differently. Going back to the telephone example, the Oracle EPM Cloud picks up the phone, calls the agent and says, “I’ve got a job for you.” The agent says, “Got it,” and the call ends.

    With SYNC, there is no queuing of jobs; each job is immediately sent to the agent and processed.

    SYNC versus ASYNC – Which do I use?

    One might wonder, why would I ever want to use the ASYNC method when SYNC sounds much more efficient? The key difference between the two methods is security. The ASYNC method does not require any web addresses or ports to be opened to the internet; the agent simply communicates with the Oracle EPM Cloud over port 443 (HTTPS – secured).

    SYNC in contrast requires at least one externally facing IP to be open to the Oracle EPM Cloud. Now before you shut that down, there are safeguards that the agent provides to prevent malicious actions. I’ll describe the setup I used in my testing as an example.

    We have a sample EBS deployment that runs within our lab environment. This image is not open to the external world; it can only be hit from within our lab environment. The environment is so locked down that I can’t even access the EBS login page when connected to our VPN; I have to be physically logged into a server within the domain in which the image exists. Now that may sound odd but remember, I am a consultant and a remote employee. We don’t have a global domain that I connect to every day. Our lab environments are isolated in specific domains.

    All that said, we have an externally facing web server that accepts communication on port 19000. That web server can communicate with the EBS server. So if we look at the capabilities of the agent, we can have the agent running on one server (EBS application server) while accepting requests from another server (web server). With a little configuration on the web server, we can redirect the request received by the web server to the application server.

    Thankfully, I have some of the best infrastructure people on the planet in my organization because I wouldn’t even know where to begin with this. A special thank you to Leonard Wood and Tarun for their tireless efforts to get this working so we could continue beta testing for Oracle. As they explained it to me, a reverse proxy was implemented on the web server to redirect all traffic from the web server on a specific port to the application server agent port.  A reverse proxy can act as a gateway service allowing access to servers on your trusted network (where the EPM Integration Agent is installed) from an external network.

    The following was added to the httpd.conf file on the web server. Bear in mind, these are not the actual entries but they illustrate how the reverse proxy is configured.
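    Since the actual entries can't be shared, here is a sketch of the kind of directives involved (an Apache mod_proxy reverse proxy; the host name and internal port are made up for illustration):

```apache
# Forward external requests arriving on port 19000 to the agent
# running on the internal EBS application server (illustrative values)
Listen 19000
<VirtualHost *:19000>
    ProxyPreserveHost On
    ProxyPass        / http://ebs-app.internal.example.com:9090/
    ProxyPassReverse / http://ebs-app.internal.example.com:9090/
</VirtualHost>
```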

    Next, the agent configuration was updated in the EPM Cloud. The web URL field denotes the web address to which Cloud Data Management will send requests. The physical URL represents the IP address and port to which the web URL will redirect the request through the reverse proxy. I think of this as akin to the idea of ODI topology, where you have a logical (web URL) and a physical (physical URL) web address. The cloud interacts with the logical and the agent works with the physical. The reverse proxy is like the ODI context that functions as a map between the logical and the physical.

    The web URL can be modified once the agent is started. There is a bug in the pre-GA release that sets the web URL to the physical URL whenever the agent is restarted, but I have raised that with Oracle and expect it to be fixed by the time the EPM Integration Agent reaches its generally available (GA) release.

    How Does It Work?

    Now that we understand a little bit of the architecture, let’s talk about functionality. As I mentioned early on in this post, the agent allows the Oracle EPM Cloud to interact with data that is outside of the Oracle EPM Cloud. When you configure the agent, you define a relational database SQL query that should be executed by the agent against an on-premises database. The query allows for bind variables which can be viewed as essentially run-time substitution variables. When the agent receives a request (SYNC or ASYNC), it executes the query defined for the integration against the local RDBMS, creates a flat-file locally, uploads the flat-file to the EPM Cloud and hands processing back to Cloud Data Management (or Integration).
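    That flow is simple enough to sketch in a few lines. Here it is in Python, with sqlite3 standing in for the JDBC source the real agent would use; the table, column names, and bind variable are all made up for the example:

```python
import sqlite3  # stand-in for the on-premises RDBMS reached via JDBC

def run_extract(conn, query, bind_values, out_path):
    """Sketch of the agent's core step: execute the integration's SQL query
    with its bind variables, then write the result set to a local flat file
    that will be uploaded to the EPM Cloud."""
    rows = conn.execute(query, bind_values).fetchall()
    with open(out_path, "w") as f:
        for row in rows:
            f.write(",".join(str(col) for col in row) + "\n")
    return len(rows)

# Demo against an in-memory database standing in for the source system
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gl_balances (account TEXT, period TEXT, amount REAL)")
conn.execute("INSERT INTO gl_balances VALUES ('1100', 'Jan-19', 5000.0)")
row_count = run_extract(
    conn,
    "SELECT account, amount FROM gl_balances WHERE period = ?",  # ? = bind variable
    ("Jan-19",),
    "extract.dat")
```

    The bind variable (the ? placeholder) is filled in at run time, which is what makes it behave like a run-time substitution variable.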

    This data flow is important to understand for two reasons. First, all data loaded to the Oracle EPM Cloud is loaded as a flat-file. This is a source of tremendous confusion in the industry. FDMEE, EPM Automate, and the REST API all send data to the cloud in the form of a flat-file. There is no streaming of data; I repeat, all data loaded to the cloud is uploaded as a flat-file. For the folks who like to challenge definitive statements such as the one I just made: yes, I acknowledge that data can be directly input into the model through web input forms and SmartView inputs, but I would hardly consider either of those an integration strategy, so they fall outside the scope of my statement.

    The second reason it is important to understand the data flow for inbound integration to the Oracle EPM Cloud is that it directly relates to the capability of the EPM Integration Agent. If we investigate the agent steps:

    1. The agent receives a data extract request
    2. The agent performs an action and produces a flat-file
    3. The agent uploads the flat-file and hands processing back to Data Management

    By default, yes, we expect the agent to query a database and output the results to a flat-file. Well, here’s the magic, folks: we can “trick” the agent. Instead of executing a query against an RDBMS, the agent can perform other actions such as a file move and rename. The agent has one requirement: the name of the data file to be uploaded must follow a required naming convention (number.dat, where number is the process ID of the execution). So we can tell the agent to

    • Skip the query execution
    • Sweep some local network directory for a file
    • Copy the file over to the agent data folder and rename it appropriately

    In doing so, the agent supplies data back to the EPM Cloud and the cloud is none the wiser about where the data came from. In layman’s terms, the EPM Integration Agent will allow data to be sourced from relational (RDBMS) sources as well as any other source that can provide a flat-file output. The possibilities are limited only by the source system and your scripting ability. Want to source data from a cloud application like SalesForce? No problem; write a script that does a data extract using SalesForce’s REST API, output the results to a flat file and pass it back to the agent. Want to source data from SAP ECC? No problem; write a command line interface that executes an SAP ABAP extract, sweep and rename the file and pass it back to the agent. Starting to see the possibilities?
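    The sweep-and-rename trick can be sketched like this (the directory layout, file pattern, and function name are all made up; the real event script would get its paths and process ID from the agent context):

```python
import glob
import os
import shutil

def stage_external_file(sweep_dir, agent_data_dir, process_id):
    """Sketch of the 'trick': instead of running a query, sweep a network
    directory for a prepared extract and stage it under the <process id>.dat
    name the agent requires."""
    matches = sorted(glob.glob(os.path.join(sweep_dir, "*.txt")))
    if not matches:
        raise FileNotFoundError("no extract file found in " + sweep_dir)
    target = os.path.join(agent_data_dir, str(process_id) + ".dat")
    shutil.copyfile(matches[0], target)  # copy and rename in one step
    return target
```

    Once the correctly named .dat file exists in the agent data folder, the agent simply uploads it as if its own query had produced it.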

    Agent Scripts – How the Magic Happens

    The agent provides the ability to execute scripts at four stages of the agent’s processing:

    • Before the data extract
    • After the data extract
    • Before the file upload to Oracle EPM Cloud
    • After the file upload to Oracle EPM Cloud

    These event-based scripts are what allow us to extend the capability of the EPM Integration Agent beyond just local relational data sources. Let’s walk through a basic need: we want to load data from SalesForce on demand. In this example we would do the following:

    1. Create a dummy extract query in Cloud Data Management to simply enable the integration to the agent
    2. Create a before data extract event script that:
      1. Cancels the extract query
      2. Invokes a data extract from SalesForce using the SalesForce REST API
      3. Creates a data file in the agent folder in the naming convention required by the agent

    That’s literally it. The presence of the file allows the agent to continue as though the dummy query had produced it.
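    The before-extract script for that example could look something like the sketch below. To be clear, the Salesforce API version, field names, and helper function names are all my assumptions for illustration; the real event script would pull its process ID and data directory from the agent context:

```python
import json
import os
import urllib.request
from urllib.parse import quote

def fetch_salesforce_records(instance_url, token, soql):
    """Hypothetical Salesforce REST query call; the endpoint version and
    bearer-token auth shown here are illustrative."""
    req = urllib.request.Request(
        instance_url + "/services/data/v45.0/query?q=" + quote(soql),
        headers={"Authorization": "Bearer " + token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["records"]

def write_agent_file(records, data_dir, process_id):
    """Write the extracted records to the <process id>.dat file the agent
    will pick up and upload (field names are made up for the example)."""
    out_path = os.path.join(data_dir, str(process_id) + ".dat")
    with open(out_path, "w") as f:
        for rec in records:
            f.write("%s,%s\n" % (rec["Account__c"], rec["Amount__c"]))
    return out_path
```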

    Event scripts for the agent can be written in Jython (using the latest version of Jython!) or Groovy. For those familiar with FDMEE scripting, Jython and the way you interact with the EPM Agent context (the same concept as the FDMEE context) are identical. For the Groovy fans, there may be a slight learning curve for how to use what is known as the context, but nothing too steep. The context is simply a hash map of run-time values related to the agent execution.

    Now the real thing that excites me is that the agent script capability lays the foundation for scripting to be available in the Oracle EPM Cloud. By having a framework that invokes on-premises scripts, you could eventually have event-based or even custom scripting in the EPM Cloud that invokes the agent, runs the script locally and communicates the results back to the EPM Cloud. I have discussed this with Product Management and Development and outlined a few use cases. They are receptive, so please feel free to comment below to share ideas you may have for Oracle to consider.

    When Can I Have It?

    I checked in with Oracle Development today (9 Sep 2019) and asked about timing for GA. This wouldn’t be a proper EPM blog post if I didn’t refer you to the Oracle Safe Harbor statement. With that out of the way, the EPM Integration Agent is planned for the 19.10 release! If you aren’t familiar with the naming convention of the EPM Cloud releases, take a quick stroll over to my recent post and learn all about it. October should be an exciting month!

    Wrapping Up

    Well, this was a long post, but this is a significant step forward and hopefully you found it worth the read. I am incredibly excited about the future of Cloud Data Management and its ability to interact with on-premises and non-Oracle EPM Cloud systems. I plan to update the white paper I authored in December 2017 to reflect not only this new capability but also another emerging option in the Oracle EPM Cloud data integration landscape. At Alithya, we recently signed a partnership agreement with OneCloud because we believe they have a very attractive offering in the data integration space that can truly enhance the value of Oracle EPM Cloud. I hope to have the updated analysis out in the next couple of months. Stay tuned.




    • Tim Gaumont

      Just a confirmation for everyone, this is planned to be included in 19.10 so it'll be on Test sites starting this Friday, October 4th.

    • Adrian Ward

      Great post Tony, one question.

      I've been reading the notes on this and it mentions the new Data Integration module.

      Does this need to be installed to use EPM Integration Agent or can it be used with the standard Data Management module?



    • Tony Scalese

      Yes, the EPM Integration Agent can be used with the Data Management module. Data Integration is basically Data Management with the new SUI. When I did my testing, I used classic Data Management and it was successful.

    • Adrian Ward

      Hi Tony,

      May be a silly question but.... were you testing this using ARCS?



      • Tony Scalese

        Not a silly question, actually a very interesting one.  I did not explicitly test with ARCS.

        Given how ARCS integration works (pull from CDM), it's an interesting problem.  You need to specify the file name in the data load rule of CDM in order for ARCS to properly execute the DLR and pull data.  The integration agent expects the file name to be ProcessID.dat where process ID is the execution number.  Since that obviously increments every time, it's a unique problem to solve.  You will need to use the AftUpload event script to rename the dat file to match the expected file name format specified in the data load rule of Data Management.  
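        A sketch of that AftUpload rename idea (the function shape and names are illustrative, not the actual agent event-script API):

```python
import os
import shutil

def rename_for_arcs(data_dir, process_id, fixed_name):
    """Copy the <process id>.dat extract to the fixed file name the ARCS
    data load rule expects, so the DLR can find it on every run."""
    src = os.path.join(data_dir, str(process_id) + ".dat")
    dst = os.path.join(data_dir, fixed_name)
    shutil.copyfile(src, dst)
    return dst
```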

        • Thejas Shetty

          You don't need to do any of that scripting. Please note that, using the Integration agent, the DM load is not a "File" based load. It is instead a "Data Source" based load. (although a text file is involved behind the scenes). The DM will process the file, based on the current Load ID & stage it to ARCS reconciliations. There is nothing you need to do in ARCS or externally to orchestrate this.

          • Adrian Ward

            Ok, here is another twist to the tale.

            Can we use Oracle Integration Cloud to connect EPM Integration Agent to our On-Premise EBS server or do we need to install something like IBM APP Connect (BlueMix) to ensure secure connection / transfer of the data files?

            • Tony Scalese

              The agent has a prebuilt connection to the on-premises EBS general ledger. If you need to connect to the subledgers, you can still use the agent directly, but you need to write the SQL extract query and define it within the Data Management setup. There is no need for IBM or any other system. The agent uses a JDBC connection to extract data from EBS and produce a flat file, and it transfers that flat file to the Oracle Cloud using the REST API.

    • Sachin Sawant

      Hi Tony,

      We were going through the EPM Integration Agent document, which mentions that it will be available as a preview version for Planning and Financial Consolidation and Close system administrator users only.

      We wanted to check for ARCS: do you have any idea whether it's available with ARCS, and if yes, in which version?

      Thanks in advance.



      • Wayne Paffhausen

        Yes.  19.10 for Oracle EPM Cloud products that have DM.


      • Tim Gaumont

        Hi Sachin - that reference in the documentation means the agent can currently be downloaded from the UI inside Planning or Consol & Close (FCCS) under the Data Integration card, however it can be used with any EPM business process (for example ARCS) since it uses EPM Data Management for integration as well.  We plan to add the Data Integration card to ARCS in the next few months as well (safe harbor applies).


        • Adrian Ward

          Hi Tim

          If we are not getting the Data Integration UI in ARCS for a few months, how do we add scripts to the agent through Data Management? It looks like that can only be done in the Data Integration UI.