I’ll address the key trade-offs across your three focus areas:
API Rate Limits: HubSpot’s standard 100 requests per 10 seconds translates to 864,000 requests daily. For 500K contacts, that’s sufficient if you’re only creating contacts. However, your complex data requirements (associations, custom properties, historical activity) likely require multiple API calls per contact:
- Contact creation: 1 call
- Company association: 1 call
- Deal associations: 1-3 calls depending on deal count
- Activity records: 1-5 calls per contact
This multiplies your API needs to roughly 2-5 million calls - about 2.5 to 6 days of continuous processing at full throttle, which is achievable within your two-week window. HubSpot typically approves temporary rate limit increases (up to 150-200 req/10sec) for migration projects, which would cut that timeline by a third to a half.
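Here's the back-of-envelope arithmetic behind those estimates; the per-contact call counts are just the ranges above, so treat the output as a planning figure rather than a guarantee:

```python
# Back-of-envelope estimate of total API calls and wall-clock time.
# Per-contact call counts below are assumptions matching the ranges above.
CONTACTS = 500_000
SCENARIOS = {
    "low":  1 + 1 + 1 + 1,    # create + company assoc + 1 deal assoc + 1 activity
    "high": 1 + 1 + 3 + 5,    # create + company assoc + 3 deal assocs + 5 activities
}

for label, calls_per_contact in SCENARIOS.items():
    total_calls = CONTACTS * calls_per_contact
    for limit in (100, 150):                      # requests per 10-second window
        calls_per_day = limit * 6 * 60 * 24       # 864,000/day at the standard 100 req/10s
        days = total_calls / calls_per_day
        print(f"{label}: {total_calls:,} calls at {limit} req/10s ~ {days:.1f} days")
```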
Bulk Import Latency: The unpredictability is your biggest risk. Bulk imports queue behind other operations in HubSpot’s processing infrastructure. We’ve observed:
- Best case: 6-8 hours for 500K contacts
- Typical case: 18-24 hours
- Worst case: 48-72 hours during high-load periods
Multiple imports (if errors require reprocessing) could consume your entire two-week window. Additionally, bulk import doesn’t support historical activity records - those require API calls regardless. This hidden requirement means you’ll use the API anyway, making pure bulk import infeasible for your use case.
Error Handling: This is where the approaches differ dramatically. API provides:
- Per-record error responses with specific failure reasons
- Immediate retry capability for failed records
- Programmatic error classification and routing
- Real-time monitoring of migration progress
- Incremental validation (test 1K records, then scale)
Bulk import provides:
- Post-processing error summary CSV (often hours after submission)
- Generic error codes requiring manual investigation
- All-or-nothing processing (can’t pause mid-import)
- Limited visibility into processing status
- No partial success tracking
For complex data with associations and custom properties, API error handling is essential. You’ll discover data quality issues that weren’t apparent in the legacy CRM - missing required fields, invalid property values, circular association references. API lets you handle these programmatically.
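Here's a minimal sketch of that per-record handling, assuming a hypothetical `client.create_contacts_batch` wrapper that returns one result per input record; the retryable status codes and error labels are illustrative:

```python
import time

RETRYABLE_STATUSES = {429, 500, 502, 503}      # rate-limit and transient server errors

def migrate_batch(client, records, max_retries=3):
    """Create one batch of contacts, classify failures, and retry the transient ones."""
    pending = list(records)
    failed = []                                 # permanent failures for the data-quality queue
    for attempt in range(max_retries):
        results = client.create_contacts_batch(pending)   # hypothetical wrapper, one result per record
        retryable = []
        for record, result in zip(pending, results):
            if result.ok:
                continue                                   # created successfully
            if result.status in RETRYABLE_STATUSES:
                retryable.append(record)                   # transient: retry with backoff
            else:
                failed.append((record, result.error))      # e.g. invalid email, missing required field
        if not retryable:
            return failed
        time.sleep(2 ** attempt)                           # simple exponential backoff
        pending = retryable
    failed.extend((record, "retries exhausted") for record in pending)
    return failed          # per-record audit trail: exactly what failed and why
```

The same classify-and-retry loop applies to the association and activity calls later in the plan.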
Recommendation for your scenario:
Implement a phased API-based migration with these optimizations:
- Pre-migration validation (Days 1-2): Process 5,000 sample contacts via API to identify data quality issues. Build error handling logic for common failures.
- Parallel processing architecture (Days 3-4): Implement a multi-threaded API client that respects rate limits. Use 8-10 parallel threads to maximize throughput while staying under the 100 req/10sec limit (a single-threaded throttling sketch follows this list).
- Batched contact creation (Days 5-9): Use the batch contact creation endpoint (up to 100 contacts per request) to cut the contact-creation call count by 100x, from 500K single-record calls to roughly 5K batch calls (see the sketch below).
- Association and activity import (Days 10-13): Once contacts exist, batch-create associations and activity records using the batch association API (50 associations per request); a sketch follows the contact-creation example.
- Validation and cleanup (Day 14): Verify data integrity, handle any failed records, and validate association completeness.
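For phases 2-3, here's a minimal single-threaded sketch of the throttled batch-create loop, assuming HubSpot's `POST /crm/v3/objects/contacts/batch/create` endpoint and a private app token; a multi-threaded client would share the same pacing through a common limiter:

```python
import time
import requests

HUBSPOT_TOKEN = "<private-app-token>"         # placeholder; use your portal's token
BATCH_CREATE_URL = "https://api.hubapi.com/crm/v3/objects/contacts/batch/create"
HEADERS = {"Authorization": f"Bearer {HUBSPOT_TOKEN}", "Content-Type": "application/json"}

MAX_REQUESTS = 100                            # requests allowed per window (standard limit)
WINDOW_SECONDS = 10
MIN_INTERVAL = WINDOW_SECONDS / MAX_REQUESTS  # pace requests evenly: at most 1 per 0.1 s

def batches(records, size=100):
    """Yield chunks of up to `size` contacts (the batch endpoint caps inputs per request)."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def throttled_batch_create(contacts):
    """Create contacts via the batch endpoint while staying under the rate limit."""
    for batch in batches(contacts):
        started = time.monotonic()
        payload = {"inputs": [{"properties": props} for props in batch]}
        resp = requests.post(BATCH_CREATE_URL, headers=HEADERS, json=payload, timeout=30)
        resp.raise_for_status()               # in practice, route failures to the retry logic above
        elapsed = time.monotonic() - started
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)

# Example: each contact is a dict of HubSpot property names to values.
throttled_batch_create([{"email": "jane@example.com", "firstname": "Jane", "lastname": "Doe"}])
```

Pacing requests evenly (one per 0.1 seconds at the 100 req/10sec limit) is a deliberately conservative way to stay inside the window without tracking it explicitly.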
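Phase 4 follows the same pattern against the batch association endpoint. A sketch assuming the v4 contact-to-company route; the association type ID is a placeholder you should verify against HubSpot's documented defaults, and the same pacing and retry treatment applies:

```python
import requests

HEADERS = {"Authorization": "Bearer <private-app-token>", "Content-Type": "application/json"}
ASSOC_URL = "https://api.hubapi.com/crm/v4/associations/contacts/companies/batch/create"
CONTACT_TO_COMPANY_TYPE_ID = 279   # placeholder: confirm the default type ID for your portal

def batch_associate(pairs, batch_size=50):
    """Create contact-to-company associations from (contact_id, company_id) pairs."""
    for i in range(0, len(pairs), batch_size):
        inputs = [
            {
                "from": {"id": str(contact_id)},
                "to": {"id": str(company_id)},
                "types": [{
                    "associationCategory": "HUBSPOT_DEFINED",
                    "associationTypeId": CONTACT_TO_COMPANY_TYPE_ID,
                }],
            }
            for contact_id, company_id in pairs[i:i + batch_size]
        ]
        resp = requests.post(ASSOC_URL, headers=HEADERS, json={"inputs": inputs}, timeout=30)
        resp.raise_for_status()    # throttle and retry exactly as in the contact-creation sketch
```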
This approach:
- Completes within your two-week window (even without rate limit increase)
- Provides real-time error handling and monitoring
- Maintains data integrity through incremental validation
- Supports rollback and retry for failed subsets
- Gives you audit trail of exactly which records succeeded/failed
The batch endpoints are the key optimization - they provide API-level control with near-bulk-import throughput. Most migrations overlook batch endpoints and use single-record APIs, artificially inflating API call counts and timeline estimates.
Request a temporary rate limit increase to 150 req/10sec as insurance, though the batch approach should make it unnecessary. Your migration becomes predictable, controllable, and completable within the two-week window while retaining the error handling and data integrity advantages of the API route.