I have an "App Driven Orchestration" integration that starts with a REST adapter receiving content similar to this:
{
"filecontent": "1,3,120,0,145,125,225,345,147,452,10,5,1,100, Comment 1, Production, Y
2,6,120,0,145,135,250,365,194,504,20,11,2,101, Comment 2, Production, Y
3,9,120,0,145,145,255,385,241,556,30,15,3,102, Comment 3, Production, Y
4,12,120,0,145,150,260,400,288,608,40,21,4,103, Comment 4, Production, Y
5,15,120,0,145,155,265,425,335,660,50,26,5,104, Comment 5, Production, Y "
}
I have to separate it line by line, so I decided to create a stage Write followed by a stage Read, and then a for-each over the stage Read output to go line by line. I tested this with a "Scheduled Orchestration" integration and it works, but when I use an integration of type "App Driven Orchestration", the read operation throws an error similar to the screenshot attached in this post.
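For reference, the line-by-line handling I'm trying to reproduce with stage Write/Read is conceptually just splitting that single string on newlines; a rough TypeScript sketch of the idea (values shortened, and the real field mapping is done in the OIC mapper, not in code):

// Conceptual equivalent of the stage Write/Read plus for-each:
// split the single "filecontent" string into CSV rows, then each row into fields.
const payload = {
  filecontent: "1,3,120,0,145, Comment 1, Production, Y\n" +
               "2,6,120,0,145, Comment 2, Production, Y",
};

function splitCsvLines(filecontent: string): string[][] {
  return filecontent
    .split(/\r?\n/)                                      // one entry per CSV row
    .map((line) => line.trim())
    .filter((line) => line.length > 0)                   // ignore a trailing blank row
    .map((line) => line.split(",").map((f) => f.trim()));
}

for (const fields of splitCsvLines(payload.filecontent)) {
  console.log(fields);                                   // one iteration per row, like the for-each
}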
I have attached the .iar file.
I appreciate your opinions and comments.
Regards.
Comment
Are you sure the request is the same? The IAR you shared expects the payload in the {"request":"request"} format, but the sample you gave above is in the {"filecontent":"filecontent"} format.
Also, I'm not sure how a multi-line CSV is passed as a JSON value?
Sorry, the payload is actually in the {"request": "request"} format, but the content itself is the same as shown above.
I have a screen in VB where I pick a CSV file with a file picker and read it using a JS function; the function stores the entire contents of the file in a single variable, and that variable is sent in the payload to the OIC integration.
I tested this by writing the variable's content to a file over FTP, and the content was written correctly.
However, at some point the stage Write and stage Read configuration stopped working.
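Roughly, the VB-side logic looks like the sketch below (the function name and endpoint URL are placeholders, not the actual names in my app); note that JSON.stringify escapes the CSV line breaks as \n, so the multi-line content travels as a single JSON string:

// Sketch of the VB page logic: read the picked CSV file into one string
// and POST it as the "request" value to the OIC integration endpoint.
// The function name and URL here are placeholders.
async function sendCsvToIntegration(file: File, endpointUrl: string): Promise<Response> {
  const filecontent = await file.text();            // whole CSV as a single string
  return fetch(endpointUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ request: filecontent }), // line breaks escaped as \n
  });
}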
Can you enable trace for the integration and provide the logs to look at?
Hi Hemanth,
I have attached the logs.
I changed the name of the integration; it is now called Test_LineByLine3.
I have updated the .iar file.
The design of the flow somehow doesn't seem right to me. Try the below:
1. Create a stage Write with an opaque schema and map the opaque element to the base64 encoding of the content value from the request.
2. Create a stage Read using the same directory/file name as the stage Write. Make sure the sample CSV is exactly the same content you are receiving in the content value of the request.
The Read response should then have a repeating element of CSV rows.
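Just to illustrate what the opaque element would carry (the encoding is done in the OIC mapper, not in client code; this is only a sketch of the base64 step):

import { Buffer } from "node:buffer";

// The opaque element of the stage Write holds the base64 encoding of the
// raw CSV string received in the request; the stage Read then parses it
// back into rows. This only shows the encode/decode round trip.
const csv = "1,3,120, Comment 1, Production, Y\n2,6,120, Comment 2, Production, Y";
const encoded = Buffer.from(csv, "utf8").toString("base64");   // value for the opaque element
console.log(encoded);
console.log(Buffer.from(encoded, "base64").toString("utf8"));  // round-trips to the original CSV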
Hi Hemanth,
What is the objective of using an opaque schema?
I do not receive an element encoded in base64.
As for using the same directory/file name, I'm assigning those two values through the properties of the stage Write, so there should be no problem.
The logs do not have the trace of the flow.