Integration


Topic

    Songqing Gu
    How to write FTP in Stage File Read (in Segment) - Answered
    Topic posted May 20, 2019 by Songqing Gu, last edited May 20, 2019, tagged Adapters, Cloud, Integration, Mapping, PaaS
    456 Views, 14 Comments
    Title:
    How to write FTP in Stage File Read (in Segment)
    Summary:
    FTP write fails after a Stage File Read (in Segment) on a staged CSV file
    Content:

    I am stuck writing CSV records to FTP from within a Stage File Read (in Segment).

    My case is:

    1. Query the DB, which returns more than 1,000 records;

    2. Loop over each record, apply mapping logic to the current record, and write it into a staging file;

    3. After the loop, use Stage File Read (in Segment), then write to the remote file system with the FTP adapter (the overall flow is outlined below).
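
    For reference, a rough outline of the orchestration. The invoke names are taken from the namespaces in the fault payload further down; everything else is a sketch, not the exact canvas:

    QueryEmployeeListF060116      -- DB adapter invoke, returns 1000+ rows
    For-Each record
        Map current record to a CSV row
        StageWriteFIle            -- Stage File: Write File (append) into a stage directory
    End For-Each
    ReadingInSegment              -- Stage File: Read File in Segments
        WriteEmployees            -- FTP adapter: Write File, inside the segment scope
    End segments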

     

    The FTP adapter fails with the error below:

    com.oracle.bpel.client.BPELFault: faultName: {{http://schemas.oracle.com/bpel/extension}remoteFault} messageType: {{http://schemas.oracle.com/bpel/extension}RuntimeFaultMessage} parts: {{ CODE={http://schemas.oracle.com/bpel/extension}remoteFault ,SUMMARY= ICS runtime execution error ,DETAIL= Translation Error. Translation Error. Error while translating message to native format. Please make sure that the payload for the outbound interaction conforms to the schema and payload size does not exceed threshold. Error occured as {1} :Application Error WriteFile } cause: {null} 

     

    Regards

    Songqing


    Comments

     

    • Hemanth Lakkaraju

      Error while translating message to native format. Please make sure that the payload for the outbound interaction conforms to the schema and payload size does not exceed threshold

      Are you sure the payload size is within the limit and all the required fields are present in the CSV? Can you attach the complete diagnostics logs so we can see the actual error?

      • Songqing Gu

        Thanks for your quick response, Hemanth!

        1. The overall transformed size is larger than 1 MB, which is why I chose a Stage write followed by a Stage read in segment. Even starting with a single record, I still face the issue.

        2. Attached the diagnostics logs for your reference: Process Name - ReadEmployee_CP; User - s.gu@acceture.com

        3. Some typical errors:

        with a small volume - Error while translating message to native format.

        with a larger data volume - Error while reading native data. [Line=62, Col=105] Expected "," at the specified position in the native data, while trying to read the data for "element with name StreetAddress", using "style" as "terminated" and "terminatedBy" as ",", but not found. Ensure that ",", exists at the specified position in the native data. (An illustrative nXSD fragment for this "terminated" style is shown below.)
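
        For context, in the nXSD generated for a delimited file each CSV column is declared with the "terminated" style named in the error. A minimal sketch (only StreetAddress comes from the error message; the other name and the ${eol} terminator on the last column are assumptions):

        <!-- each field is read up to the next comma -->
        <xsd:element name="StreetAddress" type="xsd:string"
                     nxsd:style="terminated" nxsd:terminatedBy=","/>
        <!-- the last column of a row is typically terminated by the line ending -->
        <xsd:element name="City" type="xsd:string"
                     nxsd:style="terminated" nxsd:terminatedBy="${eol}"/>

        So the error at line 62 means the translator ran out of comma-separated values in that row before it reached StreetAddress, i.e. the row has fewer delimiters than the schema expects.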

         

        Regards

        Songqing

        • Hemen Shah

          Looks like there is an issue with the stage file: somewhere you are not getting a value in your CSV for a column you have marked as mandatory.

          I can see the error "minOccurs not satisfied" in your log.

          • Songqing Gu

            Hi Hemen

            Thanks for your response and the nice spot. But this error confuses me as well, because there is only one mandatory field and the rest are all optional, and it is definitely mapped in the code.

            Regards

            Songqing

            • Hemanth Lakkaraju

               Error while reading native data. [Line=62, Col=105] Expected "," at the specified position in the native data, while trying to read the data for "element with name StreetAddress", using "style" as "terminated" and "terminatedBy" as ",", but not found. Ensure that ",", exists at the specified position in the native data.

              Did you check line 62 of the CSV to see whether the number of commas in that line matches the number of elements defined in the sample?

    • Amit Singh

      Hi ,

      Please try writing the file to the local file system of OIC first. Once that completes, use a move operation to write the file to FTP. This way it will be much faster; your stage write should happen to the local stage directory.

       

      Regards,

      Amit Singh

      • Songqing Gu

        Hi Amit

        Thanks for your response and suggestion.  

        But I am not sure how to write to the local file system and then use a move operation to write to FTP. Do you have a POC that shows how to do it?

         

        Regards

        Songqing

        • Monish Munot

          Songqing, use the stage file activity with the write operation to write to the local server (i.e., the OIC server), and then use the FTP adapter to move the file to the actual location.

          • Songqing Gu

            Hi Monish

            Thanks for your suggestion! But it brings up two concerns:

            1. Does the FTP adapter work with the staging file? The staging file looks internal to the platform.

            2. Even if you are able to move it, you would still need to map the file from the Oracle native format to the application-specific format.

            Regards

            Songqing

            • Monish Munot
              1. There is no need to use the FTP adapter for that step; just drag in a stage file activity and write the file to the internal platform.
              2. Once you are done writing all the data, write the file to the FTP location using a file reference mapping; no record-level mapping is needed (see the sketch below).
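
              A minimal sketch of that file-reference mapping, assuming the stage Write File response is available as a variable called $WriteFileToStage and that both the stage response and the FTP request expose a FileReference element (the element paths are illustrative, not taken from this integration; namespace prefixes on the select path are omitted for brevity):

              <xsl:stylesheet version="1.0"
                  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                  xmlns:ftp="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployees_REQUEST/types">
                <xsl:param name="WriteFileToStage"/>
                <!-- pass the staged file by reference; no record-level mapping is required -->
                <xsl:template match="/">
                  <ftp:WriteFile>
                    <ftp:FileReference>
                      <xsl:value-of select="$WriteFileToStage/WriteResponse/ICSFile/FileReference"/>
                    </ftp:FileReference>
                  </ftp:WriteFile>
                </xsl:template>
              </xsl:stylesheet>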
    • Dinesh Pant

      The write operation is not getting any data, and its nXSD expects a minimum of 1 record.

      <WriteFile xmlns:ns1="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployees_REQUEST" xmlns:nsmpr6="http://TargetNamespace.com/fileReference/StageWriteFIle" xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:nsmpr2="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployeesHeaders_REQUEST/types" xmlns:xml="http://www.w3.org/XML/1998/namespace" xmlns:nsmpr5="http://xml.oracle.com/types" xmlns:nsmpr0="http://xmlns.oracle.com/cloud/adapter/database/QueryEmployeeListF060116_REQUEST/types" xmlns:nsmpr1="http://xmlns.oracle.com/cloud/adapter/stagefile/ReadingInSegment_REQUEST/types" xmlns:nstrgmpr="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployees_REQUEST/types" xmlns="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployees_REQUEST/types">
         <nstrgmpr:OutboundFTPHeaderType>
            <nsmpr5:fileName>Emp_Master.csv</nsmpr5:fileName>
            <nsmpr5:directory>/u01/INTERFACES/TEST/KRONOS/INPUT</nsmpr5:directory>
         </nstrgmpr:OutboundFTPHeaderType>
         <nsmpr6:EmployeesMaster/>
      </WriteFile>

      Here we can see that the FTP write did not receive any records.
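
      For comparison, a request that had picked up data would carry at least one record under EmployeesMaster, roughly like the following (the record and field names are assumptions based on the namespaces above, except StreetAddress, which appears in the translation error):

      <WriteFile xmlns="http://xmlns.oracle.com/cloud/adapter/ftp/WriteEmployees_REQUEST/types"
                 xmlns:nsmpr6="http://TargetNamespace.com/fileReference/StageWriteFIle">
         <nsmpr6:EmployeesMaster>
            <nsmpr6:Employee>
               <nsmpr6:EmployeeID>1001</nsmpr6:EmployeeID>
               <nsmpr6:StreetAddress>100 Main Street</nsmpr6:StreetAddress>
            </nsmpr6:Employee>
         </nsmpr6:EmployeesMaster>
      </WriteFile>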

       

      • Dinesh Pant

        In your Stage File Read (in segment), you have an XSD that uses the first line as the header. So when the file written to stage contains only one row, that row is consumed as the header and no data records are produced.

        You can verify the file written to stage by writing that same file to FTP.
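
        For reference, a CSV nXSD that consumes the first row as a header typically carries attributes like the ones below; this is a minimal sketch, and where exactly the wizard places them in your generated schema may differ:

        <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                    nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="UTF-8"
                    nxsd:headerLines="1" nxsd:headerLinesTerminatedBy="${eol}">
           <!-- nxsd:headerLines="1" makes the translator skip the first row,
                so a file containing a single row yields zero data records on read -->
           <!-- record and field declarations follow -->
        </xsd:schema>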

        • Songqing Gu

          Hi Dinesh

          You just spotted the root cause of my problem: the XSD definition. It causes the transfer of a one-line file to fail and also brings up weird issues when transferring multi-line data files. By the way, this schema setting does work with Stage File Read (Entire File) plus FTP, except that the first line of data goes missing.

          I have made some changes to my schema and delivered my work.

          Thanks for your help, Dinesh!

          Songqing