

    Hiten Shah
    Supplier Integration using OIC
    Topic posted July 8, 2019 by Hiten Shah, tagged Connection, ERP Cloud, Integration, REST, SOAP
    212 Views, 6 Comments
    Import in sequence Supplier, Supplier address, Supplier Site, Supplier Site Assignment, Supplier Contacts

    Hello All,

    We have a requirement to implement Supplier integration using OIC.  The approach I am considering is to create a wrapper orchestration that calls child orchestrations to load, via UCM, the interface and final tables for Supplier, Supplier Address, Supplier Site, Supplier Site Assignment, and Supplier Contacts.  Each integration must execute only after the previous one succeeds: after each integration, I check the ESS job status for the previous call and then trigger the next integration.
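    The sequencing described above could be sketched roughly as follows. This is an illustrative Python sketch, not OIC orchestration code; `submit_load` and `get_status` are hypothetical stand-ins for the actual web service calls.

```python
# Run each FBDI load in the fixed order, starting the next only after
# the previous ESS job reports SUCCEEDED. Helper callables are injected
# so the sequencing logic is independent of the transport.

LOAD_ORDER = [
    "Supplier",
    "SupplierAddress",
    "SupplierSite",
    "SupplierSiteAssignment",
    "SupplierContacts",
]

def run_in_sequence(submit_load, get_status):
    """submit_load(name) -> ESS request id; get_status(request_id) -> terminal status string."""
    results = {}
    for name in LOAD_ORDER:
        request_id = submit_load(name)
        status = get_status(request_id)
        results[name] = status
        if status != "SUCCEEDED":
            break  # stop the chain if a prerequisite load failed
    return results
```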

    My question is whether you have done a similar integration in OIC:

    1) Using the UCM approach, I am unable to identify whether a specific supplier was created or not, since we read the entire CSV at once.  Is there any other way you have tried?

    2) How do you capture which suppliers were created and which errored?

    3) Any other ideas, such as a REST API or another solution?





    • Dale Barthold

      I would use the delivered FBDI templates with the loadAndImportData or loadBulkData web service. You will be able to use the delivered reports to see what loaded and what failed.

      • Hiten Shah

        Thanks Dale for the reply. 

        Yes, correct; loadBulkData is the option I am leaning towards.  Two questions below:

        (1) As a counter-thought: since there are 5 files (Supplier, Supplier Address, Supplier Site, Supplier Site Assignment, Supplier Contacts) that need to be processed one after the other, is there a better way than creating 5 orchestrations (using UCM, interface, and final tables)?

        (2) Say we take the importBulkData path.  We plan to have a single FTP folder with all 5 files in CSV format.  So how do I control which file is read first, Supplier or Supplier Address?  One option is to use the filename and loop: if the file name is not "Supplier", read the next one.  However, I am trying to find a better way to handle this.  Your ideas are appreciated.

        Any ideas?
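        A minimal sketch of the filename-based ordering idea in (2), assuming the file names start with the object name (e.g. "Supplier_20190708.csv" — an illustrative convention, not something mandated by FBDI):

```python
# Classify files from a single FTP folder by filename prefix and return
# them in the required processing order. Prefixes are matched longest
# first so "SupplierAddress_x.csv" is not mistaken for "Supplier".

MATCH_ORDER = ["SupplierSiteAssignment", "SupplierContacts",
               "SupplierAddress", "SupplierSite", "Supplier"]

PROCESS_RANK = {"Supplier": 0, "SupplierAddress": 1, "SupplierSite": 2,
                "SupplierSiteAssignment": 3, "SupplierContacts": 4}

def classify(filename):
    for name in MATCH_ORDER:
        if filename.startswith(name):
            return name
    return None  # not one of the five supplier files

def in_processing_order(filenames):
    known = [f for f in filenames if classify(f) is not None]
    return sorted(known, key=lambda f: PROCESS_RANK[classify(f)])
```

        The same classification could drive a wrapper orchestration that invokes the matching child integration for each file in turn.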

        • Dale Barthold

          OK, I have to make a lot of assumptions here, so I can't give precise detail.

          #2 first - Do you have more than one of each file at a time, or is all data to be processed contained in one file each (1 supplier file, 1 address file, 1 site file, etc.)?

          Can you create a separate FTP folder for each file?  Then you can pick the file(s) based on the path name.  I don't remember if you can use wildcards when getting the file; could that be an option?  Also, if you are using looping, you could use a variable for the path/file name and change it for each step.

          #1 - What is the requirement for a supplier that fails the integration process?  Are you expected to fix the problem, or just report this information to someone?  Since each supplier step depends on the previous step completing successfully, you are going to get a lot of errors when a previous step fails (i.e., if the address fails, then the site, site assignment, banks, etc. will fail for that supplier as well).

          On a failure, I would just email the import reports to the supplier management team, have them correct the data, and resubmit the supplier in full.  If part of the supplier was created, it will error as a duplicate, but the portions of the supplier that did not get created the first time will be created.

          The other thing to work out is that each file needs to complete before you run the next.  Are you using a callback to determine when you can fire off the next file?  How are you controlling the timing of execution of the next set of files?


          One other option is to look at the Supplier SOAP web service instead of FBDI.

          • Hiten Shah

            Thanks, Dale, for your help and detailed reply.  I appreciate it.

            #1 first - Yes, we will have multiple files.  I suggested using a separate FTP folder for Supplier, Supplier Address, Site, etc.  However, the preference is to have a single folder for all files: create a wrapper orchestration and internally decide based on the file name; if "Supplier", call the Supplier orchestration; similarly, if "Supplier Address", call the Supplier Address orchestration.  (I am still working on this plan, so I am not sure if a single folder will work using filenames.)

            #2 - If a supplier fails, currently we just have to report it.  I have already captured the log files and downloaded them to an FTP Log folder for easy user access.  I am unable to email the reports; if I could email the import reports, that would be amazing.  Do you have any idea how I can email the reports?

            #3 - Yes, each file needs to complete before the next.  I am using a wrapper integration.  Consider this scenario: I execute the Supplier integration, load the data to UCM and then to the interface tables, and finally use importBulkData and check the status of the Supplier job.  I use a while loop to check the ESS job status; if it is "SUCCEEDED", then I call the next integration.  Attached is the screen dump for SubmitESSJobRequest and GetESSJobRequest.

            Note: This is not yet tested, so I am not sure if it works; I am still developing it.
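            The while loop described above amounts to polling until the job reaches a terminal state. A rough Python equivalent (not OIC orchestration code — the status call is injected as a callable, standing in for the getESSJobStatus SOAP operation):

```python
import time

# Poll an ESS job until it reaches a terminal state, then let the caller
# decide whether to trigger the next integration. A polling interval and
# a cap on the number of polls avoid spinning forever.

TERMINAL = {"SUCCEEDED", "ERROR", "WARNING", "CANCELLED"}

def wait_for_ess_job(get_status, request_id, interval_sec=30, max_polls=120):
    for _ in range(max_polls):
        status = get_status(request_id)
        if status in TERMINAL:
            return status
        time.sleep(interval_sec)
    raise TimeoutError(f"ESS job {request_id} did not finish in time")
```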

            Is there any other way to verify this, such as the "callback" you mentioned?  Can you guide me on that approach?

            Summarizing all of my questions for you:

            1) Any feedback on my plan in point 1?

            2) Do you have any idea how I can email the reports?

            3) What is the "callback" approach?  Can you guide me on it?  Is it custom code or an OIC feature?

    • Dale Barthold

      #1 first - You can work through the file selection process, either by name, wildcard, folder structure, etc.  Remember, the .zip file can have any name you want; it is only the .csv files inside that need specific names.

      #2, emailing reports - You can replicate the delivered reports and use bursting to email them to users or groups of users.  Search for bursting; there is a ton of info on this.

      #3 - Are you running separate web services to copy the file to UCM, then load to the interface tables, then import to the base tables?  Or are you using importBulkData in ErpIntegrationServiceSoapHttp?

      With ErpIntegrationServiceSoapHttp importBulkData, the SOAP payload actually contains the .zip file in the Content element of the payload.  It copies the files to the UCM server and runs the Load and Import ESS jobs.


      If you use a while loop, does that mean you are going to keep checking to see if the ESS process is finished before you invoke the next step?  A callback will send you the status of the parent and child processes when they are finished, regardless of status.  Here is a PDF that covers callbacks; there is a lot of other material on callbacks on the web.

      If you are checking for status, remember that it does not need to be success in order to call the next step.  If you have 100 suppliers and 5 fail, don't you want to continue on to load the next step for the 95 that were good?  Or do you have 1 file per supplier?  If you are doing 1 file per supplier, then maybe just use direct web services instead of bulk load.

      • Hiten Shah

        Hello Dale,

        1) Yes, we are using separate web services to copy the file to UCM, load to the interface tables, and import to the base tables.  I have never used importBulkData in ErpIntegrationServiceSoapHttp.  So is this a single call instead of 3 calls?  Is it a better approach than separate web service calls?

        2) Yes, we use a while loop.  I have not tried the callback routine.  One question on the callback routine: does it have to be written in Java or JavaScript?  Sorry if that is a layman question; I have never tried it.

        3) You are correct: of 100 suppliers, if 5 fail, we have to load the others.  I will check with our team.  It's not one supplier per file.

        4) I tried the REST API directly to create a supplier.  I am getting the error below, so I am stuck there.  We have created the custom role and given it the required rights.

        /fscmRestApi/resources/ returned a response status of 400 Bad Request: The action "create" is not enabled.  A 400 Bad Request error indicates that the target service is unable (or refuses) to process the request sent by the client (Oracle Integration Cloud), due to an issue that is perceived by the server to be a client problem.
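        For reference, a direct REST call would look something like the sketch below. The resource path and attribute names are assumptions based on the Fusion Suppliers REST resource and should be checked against the API reference for your release; in my experience the "action create is not enabled" 400 usually points to a missing privilege on the resource rather than a malformed payload.

```python
import base64
import json

# Build (but do not send) a create-supplier REST request: URL, headers
# with basic auth, and a minimal JSON payload. Sending it would be a
# plain HTTP POST via urllib or requests.

def build_create_supplier_request(base_url, user, password, supplier_name):
    url = f"{base_url}/fscmRestApi/resources/latest/suppliers"
    auth = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    headers = {
        "Authorization": f"Basic {auth}",
        "Content-Type": "application/json",
    }
    payload = json.dumps({"Supplier": supplier_name})
    return url, headers, payload
```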