We’re implementing a storage tiering strategy for our ERP system’s historical data. We have 7 years of compliance retention requirements for financial records, audit logs, and transaction history - about 80TB currently and growing 15TB annually.
We're debating between keeping everything in S3 Standard with lifecycle policies and aggressively tiering to Glacier/Deep Archive. The compliance team needs occasional access to records (maybe 20-30 retrievals per month), but most data sits untouched for years.
S3 Standard costs add up: about $1,840/month for 80TB, while Glacier Deep Archive would run roughly $80/month for storage but $0.02/GB for retrievals. With unpredictable access patterns for audits, we're worried about surprise retrieval bills. What storage tiering approaches have worked for others with similar compliance-driven archival needs?
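For context, here's the break-even math on our numbers (AWS list prices at the time of writing; worth re-checking against current pricing):

```python
# Rough break-even sketch: how much would we have to retrieve per month
# before Deep Archive stops being cheaper than Standard?
STANDARD_PER_GB = 0.023        # S3 Standard, $/GB-month (list price)
DEEP_ARCHIVE_PER_GB = 0.00099  # Glacier Deep Archive, $/GB-month
RETRIEVAL_PER_GB = 0.02        # Deep Archive standard retrieval, $/GB

gb = 80 * 1000  # 80 TB

standard_monthly = gb * STANDARD_PER_GB          # ~$1,840/month
deep_archive_monthly = gb * DEEP_ARCHIVE_PER_GB  # ~$79/month

break_even_gb = (standard_monthly - deep_archive_monthly) / RETRIEVAL_PER_GB
print(f"Break-even retrieval volume: {break_even_gb / 1000:.0f} TB/month")
```

By this math we'd have to pull back most of the archive every single month before retrieval fees erased the storage savings, so 20-30 record lookups seem unlikely to produce a surprise bill - but sanity-checking the average size of a retrieved record would firm that up.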
We implemented a three-tier strategy: S3 Standard for current year, S3 Glacier Instant Retrieval for years 2-3, and Glacier Deep Archive for years 4-7. This balances cost with retrieval speed for audits. Most audit requests hit the recent data anyway. The instant retrieval tier costs more than Deep Archive but eliminates the 12-hour wait for urgent compliance requests.
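The whole scheme fits in a single lifecycle configuration. A sketch of ours (the `records/` prefix and exact day counts are illustrative - adjust to your key layout):

```python
# Three-tier lifecycle rule, to be passed to boto3's
# s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=...).
# Prefix and day boundaries are illustrative.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "erp-archive-tiering",
            "Status": "Enabled",
            "Filter": {"Prefix": "records/"},
            "Transitions": [
                # After year 1: Glacier Instant Retrieval (millisecond access)
                {"Days": 365, "StorageClass": "GLACIER_IR"},
                # After year 3: Deep Archive (cheapest, restore takes hours)
                {"Days": 1095, "StorageClass": "DEEP_ARCHIVE"},
            ],
            # Expire once the 7-year retention window closes
            "Expiration": {"Days": 2555},
        }
    ]
}
```

One rule per prefix keeps the policy auditable in itself, which our compliance team appreciated.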
Consider compliance requirements beyond just retention duration. Some regulations specify retrieval time SLAs. We had to keep certain audit categories in S3 Standard despite the cost because regulators required 1-hour access guarantees. Review your specific compliance framework - SOX, GDPR, industry regulations - they often dictate tiering decisions more than cost optimization. You might need a hybrid approach where different data types go to different tiers based on regulatory retrieval requirements.
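One way to keep the per-category decisions explicit is a small SLA-to-tier mapping that drives which lifecycle rule a data category falls under. A toy sketch (the hour thresholds are assumptions, not from any particular framework - substitute your regulators' actual numbers):

```python
# Toy mapping from a category's regulatory retrieval SLA to a storage class.
# Thresholds are illustrative assumptions; Deep Archive standard restores
# can take up to 12 hours, which is what the last branch encodes.
def storage_class_for(sla_hours: float) -> str:
    if sla_hours <= 1:
        return "STANDARD"      # hard 1-hour access guarantees: stay hot
    if sla_hours <= 12:
        return "GLACIER_IR"    # same-day access: instant retrieval tier
    return "DEEP_ARCHIVE"      # anything looser: cheapest tier wins
```

Writing it down this way also gives auditors a one-page answer to "why is this data in that tier?"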
The S3 Intelligent-Tiering idea is interesting, but doesn’t it have monitoring fees that might offset savings? Also, our compliance team sometimes needs same-day access for regulatory audits. Would Glacier Instant Retrieval handle that, or do we need to keep more in Standard tier?
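Doing the monitoring-fee math on our own data (fee is the AWS list price at the time of writing; the average object sizes are guesses, since we haven't measured ours yet):

```python
# Back-of-envelope: does the Intelligent-Tiering monitoring fee matter?
# AWS charges $0.0025 per 1,000 monitored objects per month (list price;
# objects under 128 KB are not monitored or charged).
FEE_PER_1000_OBJECTS = 0.0025
TOTAL_BYTES = 80 * 10**12  # 80 TB

def monitoring_cost(avg_object_bytes: int) -> float:
    objects = TOTAL_BYTES // avg_object_bytes
    return objects / 1000 * FEE_PER_1000_OBJECTS

# The fee scales with object count, not data volume:
print(monitoring_cost(10 * 10**6))   # 10 MB avg objects -> ~$20/month
print(monitoring_cost(200 * 10**3))  # 200 KB avg objects -> ~$1,000/month
```

So whether the fee offsets the savings seems to hinge entirely on how many objects we have - large archival bundles make it negligible, millions of small files could make it real money.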