Transportation Management



    Anand Sharma
    OTM Cloud 6.4.2-Cutover plan for Go Live
    Topic posted September 20, 2018 by Anand Sharma, tagged Cloud, Transportation Management
    Share experience on Cloud cutover


    Can anyone share their experience of OTM cloud cutover activity:

    1. Business Cutover

    2. Technical Cutover



    OTM Cloud 6.4.2/18c



    • Kelly Cooper

      Please share if you have any experience.  That's what this forum is for :)

      • Anand Sharma

        Hi Kelly,

        We are still working on it; the project is still ongoing, so I can only share once we have completed it.


    • Joseph Callan

      The posted topic prompts a few other questions. Here are the questions I would have for Anand:

      1. Are you referring to a transition from an on-prem OTM deployment model?

      2. Are you referring to a company already live on OTM Cloud, wondering about the impacts of the 6.4.2 upgrade?



      We moved an on-prem (but Oracle-hosted) solution to the SaaS version last month, going from 6.3.7 to 6.4.2. Below are the key observations.

      Prep work:

      1. You need to remove all customizations (JavaScript/modified servlets) from OTM and redesign any processes that relied on custom procedures.
      2. If you have any custom data extractions plugged into the current database using JDBC or any other database connectivity, this is the best time to redesign them.
      3. Standalone PC*MILER is no longer supported; you need to get a web-service-based PC*MILER license.
      4. One user will typically have one email, so you can no longer create multiple user IDs per user. The same applies to carriers. This can be a process change as well as a technical change if any business cases need access to more than one domain.
      5. 6.3 versions have columns named XX; ideally they should have been removed as part of prior migration activities. Make sure you don't have XX columns in the 6.3 database before starting the migration.
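      Item 5 can be checked mechanically before migration. A minimal sketch, assuming you can pull (table_name, column_name) pairs from the 6.3 data dictionary; the sample rows below are made up for illustration.

```python
# Flag leftover XX columns in 6.3 schema metadata before migrating.
# Against a live database you would feed this from the standard Oracle
# data dictionary, e.g.:
#   SELECT table_name, column_name FROM user_tab_columns
#   WHERE column_name LIKE 'XX%'

def find_xx_columns(columns):
    """Given (table_name, column_name) pairs, return those with XX-prefixed names."""
    return [(t, c) for t, c in columns if c.upper().startswith("XX")]

# Dry run on sample (made-up) metadata:
sample = [
    ("SHIPMENT", "SHIPMENT_GID"),
    ("SHIPMENT", "XX_LEGACY_REF"),
    ("ORDER_RELEASE", "XXCUSTOM_FLAG"),
]
print(find_xx_columns(sample))
```

      Anything this flags should be dropped or remapped in 6.3 before you start the migration.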

      Data migration:

      1. Depending on your retention requirements and the data volume you want to carry over to the cloud, this can be challenging.
      2. Identify how you are going to move data:
        1. Use DIPC (if your solution is hosted by Oracle).
        2. Use custom data replication tools (if you have full access to the current database).
        3. Use manual uploads (not accurate; they need validations and a solid method to handle deleted data).
        4. Use custom scripting (we wrote Python scripts using the CsvUtil and DbXml techniques to pull and push data between on-prem and cloud).
      3. Separate data into Config, Master, and Transaction categories.
      4. If you use manual uploads, watch out for:
        1. Linefeed data in remarks and other free-text fields.
        2. Quotes or apostrophes in identifiers and descriptions.
        3. Leading zeroes.
        4. Rounding of digits.
      5. If you use DbXml, it will not carry over insert date, insert user, update date, and update user. DbXml also has serious size limitations.
      6. Even if you use CSVs, many tables have triggers that will prevent carrying over footprint columns. Triggers can be tough to deal with; for example, capacity usage has a trigger that blocks some shipment-level table data because the capacity is already met, even though in a migration that shouldn't matter.
      7. If you are doing manual uploads, you will need to spread the data migration across batches, and you need a solid technique for replicating data deletions. The batches toward the end of the cycle will be especially challenging.
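      The manual-upload pitfalls in item 4 can all be handled in one normalization pass. A minimal sketch in plain Python (the column names and two-decimal rounding are illustrative, not OTM requirements): quoting every field keeps embedded quotes escaped and leading zeroes intact, collapsing whitespace removes linefeeds in free text, and numeric columns get a consistent rounding.

```python
import csv
import io

def clean_field(value, decimals=None):
    """Normalize one CSV field for upload: collapse embedded linefeeds,
    and optionally round numeric fields to a fixed precision."""
    value = " ".join(str(value).split())          # linefeeds in free text
    if decimals is not None:
        value = f"{float(value):.{decimals}f}"    # consistent rounding
    return value

def to_upload_csv(rows, header, numeric_cols=(), decimals=2):
    """Write rows to CSV, quoting every field so quotes/apostrophes are
    escaped and leading zeroes survive spreadsheet round-trips."""
    buf = io.StringIO()
    w = csv.writer(buf, quoting=csv.QUOTE_ALL)
    w.writerow(header)
    for row in rows:
        w.writerow([clean_field(v, decimals if h in numeric_cols else None)
                    for h, v in zip(header, row)])
    return buf.getvalue()

# Illustrative row hitting all four pitfalls at once:
rows = [("00123", 'ACME "EAST"', "note line1\nline2", "12.3456")]
print(to_upload_csv(rows, ["ID", "NAME", "REMARK", "WEIGHT"],
                    numeric_cols={"WEIGHT"}))
```

      Running every batch through one function like this also makes the later batches reproducible, which helps when you have to re-run toward the end of the cycle.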


      Integration:

      1. Fortunately, not many changes were noticed. The only key change is that all integration users need to be switched from interactive roles to a new role called INTEGRATION.
      2. Data queue setup is turned on by default; don't use it if you already have integration systems that work for you.
      3. No major XSD changes were noticed.
      4. However, you will not be able to see the entire XML in Transmission Manager; only the transmission body or the transaction body is shown, based on your choice.


      Agents and configuration:

      1. Most of the 6.3 agents will work as-is in 6.4.2.
      2. In very rare cases we noticed direct SQLs showing a few issues, mainly because of data migration issues.
      3. Use a migration project to export and import agents, saved queries, saved conditions, complex expressions, menus, and screen sets.
      4. If you are using the same events across domains, check them twice. Review agents which are set up to run as objects.
      5. Note that the application default time is going to change from x to UTC, so any jobs or SQLs that were written with the old assumption need to be changed.
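      The time-zone change in item 5 is easy to underestimate. A small illustration of how a "since midnight" window moves once the server default becomes UTC; America/New_York here just stands in for whatever your pre-migration default was.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A job that selects "everything since local midnight" returns a different
# window after the server default switches to UTC.
local = ZoneInfo("America/New_York")

def since_midnight(now, tz):
    """Start of 'today' in the given zone, expressed in UTC."""
    local_now = now.astimezone(tz)
    midnight = local_now.replace(hour=0, minute=0, second=0, microsecond=0)
    return midnight.astimezone(timezone.utc)

now = datetime(2018, 9, 20, 3, 0, tzinfo=timezone.utc)
print(since_midnight(now, local))           # window anchored to Eastern midnight
print(since_midnight(now, timezone.utc))    # same query after the UTC switch
```

      At 03:00 UTC the two windows differ by most of a day, so any scheduled SQL or agent that filters on "today" needs an explicit time-zone conversion rather than relying on the server default.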


      Reports:

      1. No more lexical parameters.
      2. BI Publisher is separated.
      3. If you have a lot of parameter-driven reports, it will take a good amount of time to rewrite them.
      4. SQL timeouts mean no more long-running reports.
      5. For simple reports, it is now easier to create them.
      6. Point all current reports in OTM to the new report paths.

      External Data feeders:

      1. If you have any external application that reads/writes data from/to the OTM database, redesign it. You have the same options as for data migration, minus the manual one. We opted for a DbXml-based approach built on an AWS solution.
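      For what it's worth, a feeder along these lines can be sketched in a few lines of Python. The element names in the payload and the servlet URL in the comment are placeholders, not the actual OTM DbXml request format; check the integration guide for your version before building on this.

```python
import xml.etree.ElementTree as ET

def build_dbxml_query(sql):
    """Wrap a read-only SQL statement in a minimal XML request body.
    The <dbxml>/<SQL> element names are illustrative placeholders."""
    root = ET.Element("dbxml")
    stmt = ET.SubElement(root, "SQL")
    stmt.text = sql
    return ET.tostring(root, encoding="unicode")

payload = build_dbxml_query(
    "SELECT shipment_gid, insert_date FROM shipment WHERE domain_name = 'MYDOMAIN'"
)
print(payload)

# To send it you would POST with basic auth to your pod's servlet URL, e.g.
# urllib.request.Request("https://<pod>/GC3/<dbxml-servlet>", data=payload.encode())
# (the URL path is a placeholder; do not take it as the actual endpoint).
```

      The same shape works in reverse for pushes; keeping the payload builder separate from the transport made it easy for us to run the feeder on AWS.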

      Data migration and prep work can run in parallel streams; redesigning all custom fixes and replacing them with available OTM components will give you confidence in the overall approach.

      Overall, data migration is going to be a tough battle.


      LakshmiDeepak Gaddipati


      • Anand Sharma

        Thanks for the detailed reply.

        We already went live in January, but I am sure upcoming projects will benefit from your feedback.