Order import fails with duplicate record errors in order-mgmt module during FBDI data loads

Our FBDI order import process is consistently throwing duplicate record errors even though we’ve validated our source data for uniqueness. The imports were working fine until last week when we started getting these failures:


Error: ORA-00001 unique constraint violated
Table: DOO_HEADERS_IFACE_ALL
Constraint: DOO_HEADERS_U1

We’ve checked our Excel templates and confirmed no duplicate order numbers exist in the source files. The staging table cleanup process runs between imports, but we’re still hitting these constraint violations. Need help with FBDI template validation procedures and understanding what might be causing phantom duplicates in the staging tables.

This usually happens when previous import jobs didn’t complete successfully and left orphaned records in the interface tables. The staging cleanup might not be catching everything. Have you checked the status of records in DOO_HEADERS_IFACE_ALL directly?
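A quick way to see what is actually sitting in the header interface table is to break it down by status and age. This is a sketch using the column names referenced elsewhere in this thread (PROCESS_STATUS, CREATION_DATE); verify them against your release before running:

```sql
-- Count leftover interface rows by processing status and load date
-- to spot orphaned records from earlier failed imports.
SELECT PROCESS_STATUS,
       TRUNC(CREATION_DATE) AS load_date,
       COUNT(*)             AS row_count
FROM   DOO_HEADERS_IFACE_ALL
GROUP  BY PROCESS_STATUS, TRUNC(CREATION_DATE)
ORDER  BY load_date;
```

Anything in an error status with a load date older than your last successful import is a likely culprit.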

Don’t manually delete from interface tables; that can cause data integrity issues. Instead, use the ‘Delete Bulk Import Data’ scheduled process to clean up properly, which ensures all related child records are removed too. Run it with the ‘Error’ status filter to clear out failed imports.

Also check if you have any concurrent imports running. If two imports with overlapping data run simultaneously, you’ll get constraint violations even with clean source data.

Check your batch size configuration too. We had issues where large batches would partially fail, leaving some records committed and others rolled back. This created duplicate situations on retry. Reducing batch size to 500-1000 records helped us avoid these partial commit scenarios.

I’d also verify your unique constraint definition. The DOO_HEADERS_U1 constraint typically includes SOURCE_ORDER_SYSTEM and SOURCE_ORDER_NUMBER. If your template isn’t populating SOURCE_ORDER_SYSTEM consistently, you might get unexpected collisions.
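One quick way to spot inconsistent population of that column is to list every distinct value, nulls included. A sketch, assuming direct SQL access to the staging schema:

```sql
-- Surface every distinct SOURCE_ORDER_SYSTEM value, including NULLs,
-- so blank or inconsistently cased values stand out immediately.
SELECT NVL(SOURCE_ORDER_SYSTEM, '<NULL>') AS source_system,
       COUNT(*)                           AS row_count
FROM   DOO_HEADERS_IFACE_ALL
GROUP  BY SOURCE_ORDER_SYSTEM;
```

If you see more than one spelling of the same source system, or a `<NULL>` bucket, that is your collision source.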

Your issue involves multiple layers that need systematic troubleshooting. Let me address each focus area:

FBDI Template Validation: First, verify your template structure matches the current version requirements for 23B. Oracle updates interface table structures between releases, and an outdated template can cause unexpected issues. Download a fresh template from the Import Orders task and compare column mappings, paying special attention to the unique-key fields. Then query the staging table directly:

SELECT SOURCE_ORDER_SYSTEM, SOURCE_ORDER_NUMBER, COUNT(*)
FROM DOO_HEADERS_IFACE_ALL
GROUP BY SOURCE_ORDER_SYSTEM, SOURCE_ORDER_NUMBER
HAVING COUNT(*) > 1;

This query will show actual duplicates in staging. Validate that your Excel template populates SOURCE_ORDER_SYSTEM consistently; blank or null values can cause the constraint to behave unexpectedly. Also check for hidden characters or trailing spaces in your source data that can make seemingly identical values actually different.
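Those hidden characters and trailing spaces can be flagged in SQL rather than by eyeballing the spreadsheet. A sketch using standard Oracle functions (TRIM and REGEXP_LIKE with a POSIX character class):

```sql
-- Flag order numbers with leading/trailing whitespace or
-- non-printable characters that make "identical" values differ.
SELECT SOURCE_ORDER_NUMBER
FROM   DOO_HEADERS_IFACE_ALL
WHERE  SOURCE_ORDER_NUMBER <> TRIM(SOURCE_ORDER_NUMBER)
   OR  REGEXP_LIKE(SOURCE_ORDER_NUMBER, '[^[:print:]]');
```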

Staging Table Cleanup: The standard cleanup isn’t always sufficient. The supported approach is the ‘Delete Bulk Import Data’ ESS job with parameters Process Status = ‘Error’ and Age = ‘1’ day; it handles cascade deletes for the child line and schedule records. Schedule it to run nightly during maintenance windows. If you have direct SQL access in a non-production environment and just need to clear aged error rows quickly, a manual delete looks like this (note it does not cascade to child interface records, so prefer the ESS job in production):

DELETE FROM DOO_HEADERS_IFACE_ALL
WHERE PROCESS_STATUS IN ('ERROR', 'REJECTED')
  AND CREATION_DATE < SYSDATE - 1;

For stuck records that won’t delete normally, check for locks:

SELECT s.sid, s.serial#, s.username
FROM v$locked_object lo, dba_objects ao, v$session s
WHERE ao.object_id = lo.object_id
  AND lo.session_id = s.sid
  AND ao.object_name = 'DOO_HEADERS_IFACE_ALL';
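If that query identifies a stale blocking session and you administer the database yourself (this applies on-premises; in Fusion Cloud SaaS you won’t have this level of access), it can be terminated with the standard command, plugging in the SID and SERIAL# returned above:

```sql
-- Terminate the blocking session; replace 123 and 45678 with
-- the SID and SERIAL# values returned by the lock query.
ALTER SYSTEM KILL SESSION '123,45678' IMMEDIATE;
```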

Unique Constraint Troubleshooting: The DOO_HEADERS_U1 constraint enforces uniqueness on (SOURCE_ORDER_SYSTEM, SOURCE_ORDER_NUMBER, SOURCE_ORDER_ID). Your error indicates collision on this combination. Common causes:

  1. Null SOURCE_ORDER_ID: If this column is null, multiple records with same SOURCE_ORDER_SYSTEM and SOURCE_ORDER_NUMBER will violate the constraint. Ensure your template explicitly populates this field.

  2. Case sensitivity: Oracle treats ‘ORDER-001’ and ‘order-001’ as different, but your validation might not. Implement UPPER() standardization in your data preparation.

  3. Concurrent imports: If you’re running multiple import jobs in parallel, they can collide. Implement serialization using a custom locking mechanism or schedule imports sequentially.

  4. Stalled or abandoned imports: A previous failed import may have left a session holding uncommitted rows (which block the constraint check but are invisible to your queries) or stale committed rows behind. Check for blocking sessions with the lock query above, and look for records with very old CREATION_DATE or null LAST_UPDATE_DATE values.
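Causes 1 and 2 above can be checked together with a single case- and whitespace-insensitive duplicate scan. A sketch over the column names used in this thread (TO_CHAR is used so the NVL default works regardless of SOURCE_ORDER_ID’s datatype):

```sql
-- Detect rows that collide once key values are upper-cased and
-- trimmed, treating NULL SOURCE_ORDER_ID as its own bucket.
SELECT UPPER(TRIM(SOURCE_ORDER_SYSTEM))       AS src_system,
       UPPER(TRIM(SOURCE_ORDER_NUMBER))       AS src_order,
       NVL(TO_CHAR(SOURCE_ORDER_ID), '<NULL>') AS src_order_id,
       COUNT(*)                               AS dup_count
FROM   DOO_HEADERS_IFACE_ALL
GROUP  BY UPPER(TRIM(SOURCE_ORDER_SYSTEM)),
          UPPER(TRIM(SOURCE_ORDER_NUMBER)),
          NVL(TO_CHAR(SOURCE_ORDER_ID), '<NULL>')
HAVING COUNT(*) > 1;
```

Rows returned here that the exact-match duplicate query misses point to a casing or whitespace problem in your source data.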

Additional troubleshooting steps:

  • Enable diagnostic logging for the Import Orders ESS job (set logging level to STATEMENT)
  • Review the ESS job log file for the exact record causing the violation
  • Check if you’re using the correct Business Unit context - wrong BU can cause unexpected duplicate detection
  • Verify that your source system isn’t reusing order numbers across different order types

Implement pre-validation in your ETL process before loading to FBDI. Create a staging table in your integration layer and run duplicate checks there before pushing to Oracle interface tables. This prevents wasted processing and gives you better error visibility.
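A minimal pre-load check in that integration layer might look like the following. STG_ORDER_HEADERS is a hypothetical staging table in your own schema (not an Oracle object), and the second branch assumes you can query the interface table from the same connection or over a database link:

```sql
-- Reject the batch up front if normalized keys collide either
-- within the batch itself or with rows already queued in the
-- Oracle interface table. Zero rows returned = safe to push.
SELECT s.SOURCE_ORDER_SYSTEM, s.SOURCE_ORDER_NUMBER
FROM   STG_ORDER_HEADERS s            -- hypothetical local staging table
GROUP  BY s.SOURCE_ORDER_SYSTEM, s.SOURCE_ORDER_NUMBER
HAVING COUNT(*) > 1
UNION ALL
SELECT s.SOURCE_ORDER_SYSTEM, s.SOURCE_ORDER_NUMBER
FROM   STG_ORDER_HEADERS s
JOIN   DOO_HEADERS_IFACE_ALL i
       ON  i.SOURCE_ORDER_SYSTEM = s.SOURCE_ORDER_SYSTEM
       AND i.SOURCE_ORDER_NUMBER = s.SOURCE_ORDER_NUMBER;
```

Gating each batch on this query returning zero rows is far cheaper than diagnosing ORA-00001 failures after the fact.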