UC has millions of case records with case history and SLA data. UC's compliance team requires historical cases to remain accessible for 10 years for audit purposes. What solution should a data architect recommend?
A. Archive Case data using the Salesforce Archiving process.
B. Purchase more data storage to support the Case object.
C. Use a custom object to store archived case data.
D. Use a custom Big object to store archived case data.
Correct Answer: D. Use a custom Big object to store archived case data.
Explanation:
Big Objects are specifically designed for massive, long-term data retention like UC's 10-year case history requirement because they:

1. Scale to billions of records;
2. Provide predictable performance;
3. Have lower storage costs; and
4. Support SOQL queries.

The other options fall short: Salesforce Archiving (A) has limited capacity and doesn't meet the 10-year requirement; additional storage (B) becomes prohibitively expensive at this scale; and custom objects (C) lack the scalability of Big Objects.

Big Objects can store archived Cases with their SLA data while maintaining accessibility for audits through:

1. Defined relationships to standard objects;
2. Indexed fields for efficient querying; and
3. Retention policies.

The solution involves:

1. Defining the Big Object schema with frequently queried fields;
2. Creating an archiving process to move eligible Cases; and
3. Implementing access controls for compliance teams.

This approach provides the required decade-long retention without impacting production Case management performance. Big Objects are part of Salesforce's enterprise data architecture, designed specifically for compliance scenarios like UC's where historical data must remain accessible but isn't actively worked.
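To make step 1 concrete, here is a minimal sketch of what a custom Big Object definition could look like in Metadata API XML. The object name (`CaseArchive__b`), field names, and index layout are illustrative assumptions, not part of the question; a real schema would mirror whichever Case and SLA fields the compliance team actually queries.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical Big Object definition, e.g. objects/CaseArchive__b.object.
     Big Object API names end in __b; the index defines the only fields
     that can be filtered on efficiently in SOQL. -->
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <label>Case Archive</label>
    <pluralLabel>Case Archives</pluralLabel>
    <fields>
        <fullName>CaseId__c</fullName>
        <label>Original Case ID</label>
        <length>18</length>
        <type>Text</type>
        <required>true</required>
    </fields>
    <fields>
        <fullName>ClosedDate__c</fullName>
        <label>Closed Date</label>
        <type>DateTime</type>
        <required>true</required>
    </fields>
    <fields>
        <fullName>SLAStatus__c</fullName>
        <label>SLA Status</label>
        <length>40</length>
        <type>Text</type>
        <required>false</required>
    </fields>
    <!-- Composite index: queries filter on these fields, in this order -->
    <indexes>
        <fullName>CaseArchiveIndex</fullName>
        <label>Case Archive Index</label>
        <fields>
            <name>ClosedDate__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>CaseId__c</name>
            <sortDirection>ASC</sortDirection>
        </fields>
    </indexes>
</CustomObject>
```

Step 2's archiving process would then copy eligible closed Cases into this object (for example via a scheduled Batch Apex job), and auditors would query it with SOQL filtered on the indexed fields, keeping production Case performance unaffected.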