Transportation Management



  • Karl Baker

    The 19B/C feature Consider Service Provider Capacity Across Days lets you configure OTM to consider using capacity limits that are available on different days. You can encourage or discourage the use of capacity on different days for both cost-based and priority-based decisions.

    The full description from the What's New document for 19B is below...


    This feature provides the ability to configure a policy that allows OTM to consider using capacity limits that are available on different days. With this feature, you can configure OTM to encourage or discourage the use of capacity on different days, for both cost-based and priority-based decisions.

    Shippers who would like to take advantage of lower-cost carriers with limited capacity, while still managing shipment priority (High, Medium, Low), now have a place to configure a policy governing whether shipping may be delayed based on priority and how long a delay is allowed.

    New Parameters

      • SPA MAX DAYS TO CONSIDER - This parameter determines how many extra possible start days OTM will consider for each shipment that is subject to capacity limits.
      • SPA TIME CHANGE POLICY - This parameter allows you to configure your policy for changing a shipment's start date in order to take advantage of (future) capacity.
        • For example, consider a service provider with the least expensive rate on a lane and a capacity of one truck per day. Assuming that today's capacity has already been used, and there is another shipment that could be assigned to the least-cost service provider if capacity were available: does it make sense to change the start date of today's shipment to tomorrow (assuming the change is feasible) to use tomorrow's capacity, or does delaying the start time by a day add an unacceptable service-time failure risk by eliminating the available slack in the shipment timing?
      • The available SPA TIME CHANGE POLICY options are:
        • '0. Discourage Time Change'
        • '1. Discourage Time Change For High Priority Shipments'
        • '2. Discourage Time Change For High & Medium Priority Shipments'
        • '3. Encourage Time Change'


    Assuming that you are already planning with capacity limits, to extend capacity limits to consider capacity availability on different days, configure the following parameters, which can be found in the Service Provider Assignment parameter group:

      • SPA MAX DAYS TO CONSIDER - This parameter determines how many extra possible start days OTM will consider for each shipment subject to capacity limits. When using this parameter, set it no larger than needed, as it can have a performance impact. For example, if two extra days is sufficient to find the right resources, there is no reason to set it higher than 2.
      • The default is "0" (zero). See the Service Provider Assignment and Resource Management topic, Service Provider Assignment Time Window Functionality section, for more details.
      • SPA TIME CHANGE POLICY - This parameter provides priority-level control over whether OTM should discourage delaying the shipment start date in order to take advantage of a cheap but capacity-limited service provider. For example, a cheap carrier has only one truck per day: should I delay my shipment departure to tomorrow to get the cheap truck, or should I let it ship today on an expensive truck? The options are:
        • '0. Discourage Time Change'
        • '1. Discourage Time Change For High Priority Shipments'
        • '2. Discourage Time Change For High & Medium Priority Shipments'
        • '3. Encourage Time Change'
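    To make the policy options concrete, here is a hedged sketch of how a planner could apply them. "Discourage" is modeled as a cost penalty added to a delayed-start option so the planner prefers not to delay; all names and the penalty value are illustrative, not OTM internals.

```python
# Hypothetical model of the SPA TIME CHANGE POLICY options listed above.
# Priorities and the penalty amount are illustrative assumptions.

HIGH, MEDIUM, LOW = "HIGH", "MEDIUM", "LOW"

POLICY_DISCOURAGE_ALL = 0       # '0. Discourage Time Change'
POLICY_DISCOURAGE_HIGH = 1      # '1. ... For High Priority Shipments'
POLICY_DISCOURAGE_HIGH_MED = 2  # '2. ... For High & Medium Priority Shipments'
POLICY_ENCOURAGE = 3            # '3. Encourage Time Change'

def delay_penalty(policy: int, priority: str, penalty: float = 500.0) -> float:
    """Extra cost applied to a delayed start under the given policy level."""
    discouraged = {
        POLICY_DISCOURAGE_ALL: {HIGH, MEDIUM, LOW},
        POLICY_DISCOURAGE_HIGH: {HIGH},
        POLICY_DISCOURAGE_HIGH_MED: {HIGH, MEDIUM},
        POLICY_ENCOURAGE: set(),
    }[policy]
    return penalty if priority in discouraged else 0.0

# A low-priority shipment under policy 2 is free to delay for cheap capacity:
print(delay_penalty(POLICY_DISCOURAGE_HIGH_MED, LOW))  # 0.0
```

    Under such a scheme, the cost-based comparison between "ship today on the expensive truck" and "ship tomorrow on the cheap truck plus penalty" falls out naturally.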





  • Karl Baker
    • OTM does not have the Int Saved Query ability on every element. Int Saved Query is used for finding GIDs. Currently, Int Saved Query is available only on the Parent Invoice element and is meant to find the parent invoice.

    Why do you need this replacement?

    • Consolidated Invoice approvals
      1. "After parent invoice approvals, voucher is generated but the cost of voucher is zero (I assume by this time child invoices are not completely processed hence the cost is zero), if approve after few mins then voucher is generated with correct amounts."   Can you please provide a scenario for this.  The voucher should be created with total net amount due of the consolidate invoice. Not sure why it is being shown as zero.
      2. How to validate each invoice cost line against the related shipment cost line?
        1. As of now, there is no hard reference between the shipment costs and the invoice lines.
        2. When OTM checks for line-level tolerance, the assumption is that each shipment cost and invoice line is unique for the combination of Cost Type, Accessorial Code, Payment Method, and General Ledger code.
        3. If the costs and invoice lines are provided in this manner, then each shipment cost can be matched to an invoice line.
        4. But if there are multiple costs or multiple lines for this combination (Cost Type, Accessorial Code, Payment Method, and General Ledger code), OTM does not have an effective way of matching the costs and lines. The logic sorts each group's costs and lines by amount, matches each with the other, and checks for tolerance.
        5. Once the tolerance checks are done, when creating the voucher, we create the voucher with the invoice line items as voucher invoice line items and the sum of the voucher invoice line items as the total amount of the voucher.
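    The matching heuristic described above can be sketched as follows. Function names, the key tuple, and the tolerance value are assumptions for illustration, not OTM internals.

```python
# Illustrative sketch: group shipment costs and invoice lines by
# (cost type, accessorial code, payment method, GL code), sort each
# group by amount, pair positionally, and apply a tolerance check.
from collections import defaultdict

def match_costs_to_lines(shipment_costs, invoice_lines, tolerance=0.05):
    """Items are (key, amount) pairs; returns (key, cost, line, within_tolerance)."""
    groups = defaultdict(lambda: ([], []))
    for key, amount in shipment_costs:
        groups[key][0].append(amount)
    for key, amount in invoice_lines:
        groups[key][1].append(amount)

    matches = []
    for key, (costs, lines) in groups.items():
        # With multiple costs/lines per key there is no hard link, so sort
        # both sides by amount and pair them up in order.
        for cost, line in zip(sorted(costs), sorted(lines)):
            matches.append((key, cost, line, abs(line - cost) <= tolerance * cost))
    return matches

key = ("BASE", None, "PREPAID", "GL100")
print(match_costs_to_lines([(key, 100.0)], [(key, 103.0)]))
```

    This also makes the limitation visible: positional pairing after sorting is a heuristic, so ambiguous groups with several equal-key costs and lines may not pair the way the carrier intended.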


  • Karl Baker

    Suggestion - based on your use case - could you try adding a PRE action check (versus POST), based on the invoice status, that gets triggered before approving an invoice?

  • Karl Baker

    As noted in your SR - the workaround while this issue is being investigated is to include all the referenced xsd files and use a physical wsdl file in lieu of the URL based WSDL.

  • Karl Baker

    From a product documentation perspective we do not generate or maintain a set of BPMs for OTM and GTM - we do, however, generate and maintain a large (and expanding) set of How To/Configuration Topics in Help which provide the standard configuration for setting up OTM and GTM to solve common business use cases and scenarios.

    OTM How To/Configuration Topics

    • Setting up OTM
    • Bulk Plan Performance and Tuning
    • Business Number Generator
    • Order Management
    • Promote to Production (P2P) Process
    • Messages
    • Configuring the User Interface
    • Tips and Troubleshooting
    • About Client Session Management
    • About Grouping Location Resources
    • About Oracle Transportation Mobile
    • About OTM Sending Hazmat Information to External Distance Engines
    • About Sending Equipment Dimensions to External Distance Engines
    • About Service Providers Accessing Workbench
    • Adding and Applying US HOS Rules
    • Allocation of Costs and Assigning General Ledger Codes
    • Configuring a Fuel Surcharge
    • Configuring an External Web Service Engine
    • Configuring and Processing a Credit Check
    • Configuring European VAT
    • Configuring and Using Rate Preferences
    • Configuring OTM to Integrate with Kewill Flagship Parcel Rates
    • Configuring Shipment Groups
    • Configuring Tiered Rating
    • Configuring VAT Components
    • Creating Agents for Report Sets
    • Creating an Automation Agent
    • Creating a Calendar
    • Creating a Multistop Order
    • Creating Canadian Postal Codes
    • Creating Rates for Zones
    • Creating Saved Queries
    • Generating Shipping Forms
    • How to Assign General Ledger Codes
    • How To Configure Conditional Booking
    • How to Configure Cross-Docks and Pools
    • How To Configure Dispatch Level Tendering
    • How to Configure Shipment Depot Stops
    • How To Create a Rule 11 Shipment
    • How To Receive Migration Project Export Packages via Email and FTP
    • How To Set Up and Execute Cooperative Routing Aggregation
    • How To Set Up Milestone Templates and Monitors
    • How To Set Up Notifications for Additional Contacts for Tender Events
    • How To Set Up SMS Text Messaging
    • How To Enable the Use of Parameterized Saved Queries
    • How To Use an XML Template
    • How to Settle an Invoice
    • How to Set Up an SMC Rate
    • Index Based Accessorial
    • Logistics Network Modeling Overview
    • Purge Actions
    • Settling a Bill
    • Setting Up Interfaces for Integration
    • Sending Messages to the Message Center
    • Spot Bid Tenders Configuration
    • Using Advanced Layouts
    • Using Maps
    • TI How To/Configuration Topics
      • Using Transportation Intelligence for an End User
      • How to Define a Default Dashboard in Transportation Intelligence
      • How to Provide Access 

    Flow Topics

    • Conditional Booking Process Flow
    • Demurrage Flow
    • Network Routing Flow
    • Order Movement Build Shipment Process
    • Order Release Equipment
    • Order Release Flow
    • Rate Maintenance Process Flow

    GTM How To/Configuration Topics

    • Restricted Party Screening
    • Landed Cost Simulator
    • Implementation of EAR Control Process in GTM
    • License Screening
    • Duty and Tax Calculation on Trade Transaction or Declaration
    • Landed Cost Estimation on Trade Transaction or Declaration
    • AES Export Declaration Filing Process
    • Trade Content Download
    • Oracle Transportation Management-Global Trade Management Integration
    • Trade Compliance Management
    • How to Use Sanctioned Territory Screening in GTM
    • Configuring and Processing Declarations
    • Product Classification Process
    • Supplier Solicitation
  • Karl Baker

    Materials look great. Thanks...

  • Karl Baker

    Can you send me the SR number?

  • Karl Baker

    As Chris indicated, the Running Manifest calculates the info on the fly; there are no DB changes.

    Just as an FYI - the calculated running info that is sent to the External Distance Engine can be persisted to the Shipment_Stop table in the fields below. To generate/persist this info, you need to define the Truck Type for your Equipment Group for these values to be calculated.

    The following information is available:

    • The Running Weight, Running Length, Running Width, and Running Height fields show the total weight, length, width, and height, respectively, from the current stop to the next stop.
    • The Out of Gauge (Left), (Right), (Forward), and (Rear) fields show the out-of-gauge details on the left, right, forward, and rear sides, respectively.
    Shipment_Stop Running Fields
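    As a sketch of how such running values can be derived on the fly from stop-level quantities (in the spirit of Chris's reply below, and with illustrative names rather than OTM fields):

```python
# Minimal sketch: derive the running weight between each stop and the next
# from the quantities picked up and delivered at each stop, in sequence.
def running_weights(stops):
    """stops: list of (picked_up_weight, delivered_weight) per stop.
    Returns the weight on board between each stop and the next."""
    running, out = 0.0, []
    for picked_up, delivered in stops:
        running += picked_up - delivered
        out.append(running)
    return out

# Load 10t at stop 1, deliver 4t at stop 2, deliver 6t at stop 3:
print(running_weights([(10.0, 0.0), (0.0, 4.0), (0.0, 6.0)]))  # [10.0, 6.0, 0.0]
```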
  • Chris Peckham

    No, there are no new tables for this data.  OTM simply derives it from available information on the fly.

  • Karl Baker

    Are you using IDCS just to do Federated SSO across services?

  • Karl Baker

    What's the business problem you are solving here?

    If I understand the request - you want to append shipment 2's stops (which have the same source as shipment 1's stops) onto the end of shipment 1, so that stop 1 is loaded for all the stops on the combined shipment (shipment 1 plus shipment 2), with the sequence of the merged shipment being the delivery sequence of shipment 1 followed by the delivery sequence of shipment 2's stops. Interesting. I don't see anything in the product that supports that requirement.

    If you explain the business scenario, maybe something will come out of that - otherwise I would suggest submitting an Idea to the Idea Lab.


  • Zomi Fathi

    Attached is a file with the TOI for continuous moves.



    We moved an on-premise (but Oracle-hosted) solution to the SaaS version last month, going from 6.3.7 to 6.4.2. Below are key observations.

    Prep work:

    1. You need to remove all custom techniques (JavaScript/modified servlets) in OTM and redesign the processes that used custom procedures.
    2. If you have any custom data extractions plugged into the current database using JDBC or any other database connectivity, this is the best time to redesign them.
    3. Standalone PC*MILER is no longer supported; you need a web-service-based PC*MILER license.
    4. One user typically has one email, so you can no longer create multiple user IDs per user. The same applies to carriers. This can be a process change as well as a technical change if any business cases need access to more than one domain.
    5. 6.3 versions have columns named XX; ideally they should have been removed as part of prior migration activities. Make sure you don't have XX columns in the 6.3 database before going for migration.

    Data migration:

    1. Depending on your retention requirement and the data volume you want to carry over to the cloud, this can be challenging.
    2. Identify how you are going to move data:
      1. Use DIPC (if your solution is hosted by Oracle).
      2. Use custom data replication tools (if you have full access to the current DB).
      3. Use manual uploads (not accurate; needs validations and solid methods to handle deleted data).
      4. Use custom scripting (we wrote Python scripts using CsvUtil and DBXML techniques to pull and push data between on-premise and cloud).
    3. Separate data into Config, Master, and Transaction categories.
    4. If you use manual uploads, watch out for:
      1. Linefeed data in remarks and other free-text fields.
      2. Quotes or apostrophes in identifiers and descriptions.
      3. Leading zeroes.
      4. Rounding of digits.
    5. If you use DBXML, it will not carry over insert date, insert user, update date, and update user. DBXML also has serious size limitations.
    6. If you use CSVs, many tables have triggers which will prevent carrying over footprint columns. Triggers can be tough to deal with; for example, capacity usage has a trigger which will reject some shipment-level table data because capacity is already met, even though we are doing a migration and that shouldn't matter.
    7. If you are doing manual uploads, you will need to spread the data migration across batches, and you need a solid technique to replicate data deletions. Batches towards the end of the cycle will be especially challenging.
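    The manual-upload pitfalls in point 4 can be handled up front when preparing the files. A minimal sketch using only the standard library; field names and the sanitization policy are illustrative assumptions:

```python
# Sketch: normalize linefeeds in free-text fields and quote every field so
# quotes/apostrophes survive; leading-zero identifiers stay strings throughout.
import csv, io

def sanitize_field(value: str) -> str:
    # Collapse embedded linefeeds that break single-line CSV parsers.
    return value.replace("\r\n", " ").replace("\n", " ").replace("\r", " ")

rows = [
    {"id": "00123", "remarks": "Dock 4\nCall ahead", "desc": "O'Brien's load"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "remarks", "desc"],
                        quoting=csv.QUOTE_ALL)  # quote everything defensively
writer.writeheader()
for row in rows:
    writer.writerow({k: sanitize_field(v) for k, v in row.items()})

print(buf.getvalue())
```

    Rounding (point 4.4) is best handled per numeric column against the target column's scale, which this sketch leaves out.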


    Integration:

    1. Fortunately, not many changes were noticed. The only key change is that all integration users need to be switched from interactive roles to a new role called INTEGRATION.
    2. Data queue setup is turned on by default; don't use it if you already have integration systems that work for you.
    3. No major XSD changes were noticed.
    4. However, you will not be able to see the entire XML in Transmission Manager; only the transmission body or transaction body is shown, depending on your choice.


    Agents:

    1. Most of the 6.3 agents will work as-is in 6.4.2.
    2. In very rare cases we noticed direct SQLs showed a few issues, mainly because of data migration issues.
    3. Use a migration project to export and import agents, saved queries, saved conditions, complex expressions, menus, and screen sets.
    4. If you are using the same events across domains, check them twice. Review agents which are set up to run as objects.
    5. Note that the application default time zone is going to change from x to UTC, so any jobs or SQLs that relied on the old assumption need to be changed.
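    The UTC shift in point 5 means every job scheduled at a local wall-clock time must be re-expressed in UTC. A small example; the "America/Chicago" zone is an illustrative assumption, not the actual pre-migration default:

```python
# Convert a job's local wall-clock run time to its UTC equivalent.
from datetime import datetime
from zoneinfo import ZoneInfo

local_run = datetime(2024, 3, 1, 2, 0, tzinfo=ZoneInfo("America/Chicago"))
utc_run = local_run.astimezone(ZoneInfo("UTC"))
print(utc_run.isoformat())  # 2024-03-01T08:00:00+00:00
```

    Note the daylight-saving wrinkle: the UTC offset of the same local time changes twice a year, so a fixed UTC schedule will drift an hour against local business hours unless the schedule is re-derived per date.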


    Reports:

    1. No more lexical parameters.
    2. BI Publisher is separated.
    3. If you have a lot of parameter-driven reports, it will take a good amount of time to rewrite them.
    4. SQL timeouts; no more long-running reports.
    5. For simple reports, it's now easier to create reports.
    6. Point all current reports in OTM to the new report paths.

    External Data feeders:

    1. If you have any external application reading/writing data from/to the OTM database, redesign it. The same options as data migration apply, minus the manual one. We opted for a DBXML-based approach built on an AWS solution.

    Data migration and prep work can proceed in parallel streams. Redesigning all custom fixes and replacing them with available OTM components will give confidence in the overall approach.

    Overall, data migration is going to be a tough battle.


  • LakshmiDeepak Gaddipati




    I am not sure which ERP you are using, but any ERP should have GL and cost/profit centers. So assign GL and other attributes at the order level, send an allocation base, and let the ERP assign the cost to the respective GL and cost/profit centers.
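    To illustrate the allocation-base idea: OTM sends one shipment cost plus a base (weight, volume, value, etc.) per order, and the ERP splits the cost proportionally. A minimal sketch with illustrative names:

```python
# Split a total shipment cost across orders in proportion to each
# order's allocation base (e.g. weight).
def allocate(total_cost, bases):
    """bases: {order_id: allocation base} -> {order_id: cost share}."""
    total_base = sum(bases.values())
    return {order: round(total_cost * b / total_base, 2)
            for order, b in bases.items()}

print(allocate(500.0, {"ORD1": 3000, "ORD2": 2000}))  # {'ORD1': 300.0, 'ORD2': 200.0}
```

    In practice you would also reconcile rounding so the shares sum exactly to the total, and pick the base (weight vs. volume vs. declared value) per your costing policy.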