Hi,
We have a CSV with 10,000 rows and about 20 columns. We loop through each row and, depending on whether a matching custom object (CO) record already exists, either update or insert it.
At around 230 rows, we get the error
Exception: line 222: Cannot save/update: NameCustomObject(ID=14239): DB API Error
Is there any setting we need to adjust or any workaround?
Thanks,
JJ
Oh, I struggled with this error for a long time. My workaround was the Data Import Wizard plus CPMs.
You are probably uploading the CSV in a custom script, right? Sooner or later it will time out or throw these DB API errors.
Try adding a periodic RNCPHP\ConnectAPI::commit(); call inside the loop.
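For anyone who lands here, the batched-commit idea looks roughly like the sketch below. This is a hedged sketch, not tested code: the object name (NameCustomObject), the default custom object package (CO), the unique-key field (ExternalRef), the CSV path, and the batch size are all assumptions standing in for the poster's actual setup, and the ROQL/commit calls should be checked against your site's Connect PHP version.

```php
<?php
// Assumes the script runs inside an Oracle B2C Service custom script
// with the Connect PHP API (v1_3 here) initialized.
use \RightNow\Connect\v1_3 as RNCPHP;

$batchSize = 100;                         // assumed; tune per site
$handle = fopen('/path/to/import.csv', 'r');   // placeholder path
$header = fgetcsv($handle);               // first row: column names
$count  = 0;

while (($row = fgetcsv($handle)) !== false) {
    $data = array_combine($header, $row);

    // Look up an existing record by an assumed unique key field.
    $res = RNCPHP\ROQL::queryObject(
        "SELECT CO.NameCustomObject FROM CO.NameCustomObject " .
        "WHERE ExternalRef = '" . addslashes($data['ExternalRef']) . "'"
    )->next();
    $existing = $res ? $res->next() : null;

    $obj = $existing ?: new RNCPHP\CO\NameCustomObject();
    // ... map the ~20 CSV columns onto $obj fields here ...
    $obj->save();

    // Flush the transaction every N rows so state doesn't accumulate
    // until the DB API errors out around row 230.
    if (++$count % $batchSize === 0) {
        RNCPHP\ConnectAPI::commit();
    }
}
RNCPHP\ConnectAPI::commit();              // flush the final partial batch
fclose($handle);
```

As the thread goes on to show, commit() alone may not be enough; the Data Import Wizard + CPM route avoids the long-running script entirely.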
Anyway, here's what you should do.
Upload the CSV through the Data Import Wizard and execute a CPM for each record. Make sure the CPM is asynchronous.
The Data Import Wizard will finish importing the 10,000 records within a minute or two, but the CPMs for those 10,000 records may take longer to finish executing; they keep processing in the background. Be patient and keep checking the logs.
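A per-record async CPM for this approach is roughly shaped like the skeleton below. This is a hedged sketch from memory of the Connect PHP CPM docs: the class name is invented, the Connect version (v1_3) and the exact interface details should be verified against your site, and Oracle also requires a companion test-harness class in the same file, omitted here for brevity.

```php
<?php
// Hedged sketch of an asynchronous CPM (Custom Process Model) handler
// that runs once per record created/updated by the Data Import Wizard.
use \RightNow\Connect\v1_3 as RNCPHP;
use \RightNow\CPM\v1 as RNCPM;

class NameCustomObjectHandler implements RNCPM\ObjectEventHandler
{
    public static function apply($runMode, $action, $obj, $cycles)
    {
        // Guard against re-entrant loops: a save() inside a CPM can
        // re-trigger the CPM, incrementing $cycles.
        if ($cycles > 0) {
            return;
        }
        // ... per-record logic here: lookups, derived fields, etc. ...
        $obj->save();
    }
}
```

Because the CPM is marked asynchronous, the wizard import completes quickly and these handlers drain from the queue afterwards, which is why the logs lag behind the import itself.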
Thanks - the process is already using the Data Import Wizard, but it's very much a manual process that we want to avoid.
The plan is that "System A" dumps the CSV to WebDAV and then "Process A" calls the controller to process the data - all automated, all scheduled.
I tried the commit() and it didn't make any difference. It still gets through 200+ rows fairly quickly before failing.
Hi Janusz.
Just a word of caution (since you mentioned WebDAV): I recently looked into automated CSV import myself and found an old topic that said:
I'm not sure if this applies to your use case, but I thought it never hurts to mention, right? :-)