We’re submitting large batches of capacity plans through the Oracle Fusion Cloud 23C REST API for production scheduling. Individual plan submissions complete in 2-3 seconds, but when processing batches of 200+ plans, the total time exceeds 15 minutes.
The API appears to process plans sequentially rather than in parallel, and we’re seeing significant latency between submissions. We’ve tried both synchronous POST requests in loops and bulk submission endpoints, but performance remains poor.
This slow processing is impacting our supply chain operations. Our planning system generates 500-800 capacity plans daily that need to be synced to Fusion Cloud within a 30-minute window. At current speeds, we’re missing our synchronization deadlines by over an hour.
We’re not hitting any explicit rate limit errors - requests succeed, they’re just extremely slow. Are there bulk import alternatives or API optimization strategies we should consider for high-volume capacity planning scenarios?
FBDI provides job status APIs you can poll. Submit your file, get a job ID, then poll the import status endpoint every 30-60 seconds. When complete, you can download the success/error report. It’s async but manageable. We process 1000+ capacity plans daily this way with full error handling and reconciliation.
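The poll-until-terminal workflow described above can be sketched roughly as follows. This is a minimal sketch: the status strings are assumptions (actual FBDI job states may differ), and `get_status` is injected as a callable so the polling logic stays testable without a live Fusion instance; in production it would wrap an authenticated GET against the import-status endpoint.

```python
import time

# Assumed terminal states; the real FBDI status values may differ.
TERMINAL_STATES = {"SUCCEEDED", "ERROR", "WARNING"}

def is_terminal(status: str) -> bool:
    """Return True when the import job has finished, successfully or not."""
    return status.upper() in TERMINAL_STATES

def poll_job(get_status, job_id, interval_s=30, timeout_s=1800):
    """Poll get_status(job_id) until a terminal state or timeout.

    interval_s of 30-60s matches the cadence suggested above; timeout_s
    guards downstream processes against a hung import job.
    """
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        status = get_status(job_id)
        if is_terminal(status):
            return status
        time.sleep(interval_s)
    raise TimeoutError(f"FBDI job {job_id} did not finish in {timeout_s}s")
```

Once `poll_job` returns, the success/error report download and reconciliation step can key off the returned status.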
The performance issues you’re experiencing come down to how Oracle Fusion Cloud’s REST API handles batch operations versus its dedicated bulk data import capabilities. Here’s a comprehensive analysis:
API Batch Processing Speed Limitations:
The REST API processes capacity plans with significant per-request overhead including authentication, validation, and database commits. Even with bulk endpoints, you’re limited by:
- Network latency per HTTP request (100-300ms)
- Individual record validation cycles
- Synchronous processing model requiring response before next submission
- Database transaction overhead per API call
For 500 plans at 2-3 seconds each, you’re looking at 16-25 minutes minimum with sequential processing.
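The arithmetic behind that estimate, as a quick back-of-envelope helper:

```python
def sequential_minutes(n_plans: int, secs_per_plan: float) -> float:
    """Total wall-clock minutes when each submission blocks until the
    previous one completes (the sequential processing model above)."""
    return n_plans * secs_per_plan / 60

low = sequential_minutes(500, 2)   # ~16.7 minutes at the fast end
high = sequential_minutes(500, 3)  # 25.0 minutes at the slow end
```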
Rate Limits and Throttling:
Oracle implements adaptive throttling that doesn’t always return 429 errors. Instead, it increases response times when detecting high-frequency patterns. Your 12-minute bulk processing time suggests you’re hitting these soft limits. The system is deliberately slowing your requests to protect backend resources.
Bulk Import Alternatives - FBDI Recommended:
File-Based Data Import (FBDI) is specifically designed for high-volume scenarios:
- Upload CSV with 500-800 plans in single operation
- Asynchronous processing handles validation in parallel
- Typical processing: 500 records in 3-5 minutes
- Reduced network overhead (one upload vs 500 API calls)
- Built-in error reporting and reconciliation
FBDI Process:
- Generate CSV file with capacity plan data
- Upload via /fscmRestApi/resources/ImportBulkData
- Poll job status using returned process ID
- Download validation report when complete
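A rough sketch of the first two steps. The CSV column names below are placeholders (the real FBDI import template defines the exact header row and order), and the upload call is shown only as a comment because the auth scheme and response field names are assumptions, not confirmed API details:

```python
import csv
import io

def plans_to_csv(plans, fieldnames):
    """Serialize capacity-plan dicts into a CSV payload for FBDI upload.

    fieldnames must match the import template's column order; values
    missing from a plan dict will raise, which surfaces bad records
    before upload rather than in the validation report.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for plan in plans:
        writer.writerow(plan)
    return buf.getvalue()

# Upload sketch (not executed here); endpoint taken from the steps above,
# basic auth and the "ProcessId" field name are assumptions:
#
# resp = requests.post(
#     f"{base_url}/fscmRestApi/resources/ImportBulkData",
#     auth=(user, password),
#     files={"file": ("capacity_plans.csv", csv_text)},
# )
# process_id = resp.json()["ProcessId"]
```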
Impact on Supply Chain Operations:
Your 30-minute sync window requires processing 800 plans in under 25 minutes (allowing buffer). The REST API cannot meet this SLA at current per-request speeds. FBDI can process the same volume in 5-7 minutes, leaving 20+ minutes of buffer for error handling and retries.
Recommended Architecture:
- Use FBDI for daily bulk synchronization (500-800 plans)
- Reserve REST API for real-time individual updates (urgent changes, corrections)
- Implement polling mechanism for FBDI job status (30-second intervals)
- Build error reconciliation workflow using FBDI error reports
- Set up alerts when FBDI jobs exceed 10-minute threshold
This hybrid approach gives you both bulk efficiency and real-time capability where needed. Most Fusion Cloud implementations handling high-volume supply chain data use this pattern successfully.
Are you submitting plans one at a time in a loop, or using the bulk endpoint? The individual submission approach will definitely be slow due to network overhead. Also check if you’re getting throttled - Oracle might be rate limiting your requests even without explicit errors.
The REST API isn’t the optimal path for high-volume capacity planning data. Have you considered using File-Based Data Import (FBDI) instead? FBDI is designed for bulk operations and processes much faster than REST API for large datasets. You can upload CSV files with hundreds of plans and they’ll process asynchronously in the background. Typically 500 plans via FBDI complete in under 5 minutes including validation.
Looking into FBDI now. Is there a way to get real-time status updates with FBDI, or is it purely batch/async? We need to know when plans are successfully loaded for downstream processes.
We’ve tried both approaches. Individual submissions in parallel threads (10 concurrent) and the bulk endpoint that accepts arrays. The bulk endpoint is slightly faster but still takes 12+ minutes for 500 plans. No explicit throttling errors in responses.
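For reference, the 10-thread approach you describe is typically structured like the sketch below (`submit_one` is injected here so the pattern stays testable offline; in practice it would wrap the POST to the capacity-plan endpoint). Note that if server-side adaptive throttling is in play, adding threads may not help: the backend can slow each request to keep aggregate throughput flat, which would match the symptoms you're seeing.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def submit_all(plans, submit_one, max_workers=10):
    """Submit plans with bounded concurrency, collecting per-plan outcomes.

    Returns (results, errors): results maps plan id -> response,
    errors maps plan id -> the exception raised, so failed plans can
    be retried or reconciled without aborting the whole batch.
    """
    results, errors = {}, {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(submit_one, p): p["id"] for p in plans}
        for fut in as_completed(futures):
            plan_id = futures[fut]
            try:
                results[plan_id] = fut.result()
            except Exception as exc:
                errors[plan_id] = exc
    return results, errors
```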
REST API rate limits in Fusion Cloud aren’t always explicit - there’s adaptive throttling that slows down high-frequency requests without returning 429 errors. You might be hitting soft limits. FBDI is definitely the recommended approach for bulk capacity planning. The REST API is better suited for real-time individual updates or small batches under 50 items. For your 500-800 daily plans, FBDI will give you 3-5x performance improvement.