Salesforce-Platform-Integration-Architect Practice Test
Updated On 1-Jan-2026
106 Questions
Universal Containers (UC) is a leading global provider of management training. UC wants students' course registration data, generated from the Salesforce student community, to be synced with the learning management system (LMS). Any update to the course registration data must be reflected in the LMS. Which integration mechanism should be used to meet the requirement?
A.
Change Data Capture (CDC)
B.
Platform Event
C.
Streaming API
D.
Outbound Message
Change Data Capture (CDC)
Explanation:
The core requirement is to sync any updates to Salesforce data (course registration) with an external system (LMS) in near real-time. This is a classic data replication/synchronization scenario. Change Data Capture (CDC) is the modern, highly scalable, and native mechanism provided by Salesforce to stream changes (create, update, delete, undelete) to standard and custom records to external subscribers, which perfectly matches the need for the LMS to receive every data update automatically.
Correct Option: ✅
A. Change Data Capture (CDC)
Change Data Capture (CDC) is the most appropriate mechanism for this requirement. It provides a stream of transactional changes for specified Salesforce objects (like the Course Registration object) after a database commit. This means the LMS can simply subscribe to the CDC stream and automatically receive event messages containing the old and new values of the changed records, ensuring near real-time data synchronization without needing complex Apex triggers or custom event publishing logic.
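To make the CDC flow concrete, here is a minimal sketch of the kind of change event message an LMS subscriber might receive and unpack. The object name `CourseRegistration__c`, the `Status__c` field, and the record ID are illustrative assumptions, not taken from the question; the `ChangeEventHeader` fields shown are standard CDC header fields.

```python
import json

# Illustrative (simplified) shape of a Change Data Capture event message
# for a hypothetical CourseRegistration__c custom object. On an UPDATE,
# only the changed fields appear in the payload alongside the header.
change_event = {
    "ChangeEventHeader": {
        "entityName": "CourseRegistration__c",
        "changeType": "UPDATE",          # CREATE, UPDATE, DELETE, or UNDELETE
        "recordIds": ["a0B5f000001AbCdEAF"],
        "commitTimestamp": 1735689600000,
    },
    "Status__c": "Confirmed",            # assumed custom field, for illustration
}

def summarize(event):
    """Extract what an LMS sync job would need from a change event."""
    header = event["ChangeEventHeader"]
    return {
        "object": header["entityName"],
        "operation": header["changeType"],
        "ids": header["recordIds"],
    }

print(json.dumps(summarize(change_event)))
```

The LMS-side subscriber would receive one such message per committed change and apply it to its own store, which is why no trigger or publishing code is needed in Salesforce.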
Incorrect Options: ❌
B. Platform Event
Platform Events are designed for custom business events (e.g., "Order Placed" or "Registration Submitted") that are decoupled from the database transaction. They are ideal for process orchestration or for notifying systems that an action occurred, but they must be explicitly published from a Flow, Process, or Apex. CDC is preferred over manually published Platform Events when the goal is generic, reliable, record-level data synchronization driven by any database change.
C. Streaming API
The Streaming API is a general term that encompasses all event-based subscription technologies in Salesforce, including PushTopic, Platform Events, and Change Data Capture. It's the API used to consume the stream, not the mechanism for generating the required detailed data change events. While the LMS would use a Streaming API client, CDC is the specific and correct underlying technology that generates the needed change data.
D. Outbound Message
Outbound Messages are a legacy feature (a Workflow Rule action) that uses SOAP to send a notification to a specific external endpoint. They are limited in the fields they can send, require the external system to implement a SOAP listener based on a Salesforce-generated WSDL, and are not recommended for new integrations due to their lower flexibility and scalability and their reliance on older technology. Salesforce advises architects to use CDC or Platform Events instead.
Reference:
Salesforce Change Data Capture Documentation
"Change Data Capture enables you to receive notifications of Salesforce data changes in real time, and synchronize records in Salesforce with an external data store... For new integrations that require near-real-time data synchronization, we recommend using Change Data Capture."
Change Data Capture
Integration Patterns
Northern Trail Outfitters (NTO) has an affiliate company that would like immediate notifications of changes to opportunities in the NTO Salesforce instance. The affiliate company has a CometD client available. Which solution is recommended in order to meet the requirement?
A.
Create a connected app in the affiliate org and select the "Accept CometD API Requests".
B.
Implement a polling mechanism in the client that calls the SOAP API getUpdated method to get the ID values of each updated record.
C.
Configure External Services to call the subscriber in Apex in the Onchange Trigger event as part of the flow.
D.
Create a PushTopic update event on the Opportunity Object to allow the subscriber to react to the streaming API.
Create a PushTopic update event on the Opportunity Object to allow the subscriber to react to the streaming API.
Explanation
This scenario requires real-time change notifications for Opportunity records. Because the affiliate already uses a CometD client, the most efficient and native Salesforce solution is the Streaming API. PushTopic events are specifically designed to notify subscribers instantly when records change. This avoids the delays, complexity, and API usage of polling or custom Apex. PushTopics provide immediate, event-driven updates perfectly suited for CometD subscribers.
✔️ Correct Option
D. Create a PushTopic update event on the Opportunity Object
PushTopic Events work directly with Salesforce’s Streaming API, which uses the Bayeux protocol supported by CometD. By creating a PushTopic that tracks Opportunity updates, the affiliate can subscribe and receive immediate notifications when changes occur. This solution is lightweight, built-in, and optimized for real-time data delivery without custom code or excessive API consumption.
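As a sketch, a PushTopic can be created with one REST call (`POST /services/data/vXX.0/sobjects/PushTopic`); the affiliate's CometD client then subscribes to `/topic/<Name>`. The topic name and the field list in the query below are assumptions for illustration; the `NotifyForOperation*` and `NotifyForFields` fields are standard PushTopic fields.

```python
import json

# Illustrative request body for creating a PushTopic via the REST API
# (POST /services/data/vXX.0/sobjects/PushTopic). The topic name and
# queried fields are made up for this example.
push_topic = {
    "Name": "OpportunityUpdates",
    "Query": "SELECT Id, Name, StageName, Amount FROM Opportunity",
    "ApiVersion": 58.0,
    "NotifyForOperationCreate": False,
    "NotifyForOperationUpdate": True,   # only changes, per the requirement
    "NotifyForOperationDelete": False,
    "NotifyForOperationUndelete": False,
    "NotifyForFields": "Referenced",    # fire when any queried field changes
}

body = json.dumps(push_topic)
print(body)
```

Once created, the CometD client subscribes to `/topic/OpportunityUpdates` over the Bayeux protocol and receives a notification for each qualifying Opportunity update.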
❌ Incorrect Options
A. Create a connected app and select “Accept CometD API Requests”
No such configuration exists in Salesforce Connected Apps. Connected Apps are for OAuth and permission management only; they cannot enable or configure Streaming API channels. Even with authentication in place, this option does not generate real-time notifications or PushTopic subscriptions.
B. Implement a polling mechanism using SOAP API getUpdated
Using getUpdated is a polling method, which means updates are delivered only when the external system checks for them. This is not instant and consumes unnecessary API calls. Polling introduces latency and is significantly less efficient than using Salesforce’s native event-driven Streaming API.
C. Configure External Services with Apex OnChange Trigger
External Services allow Salesforce to call external APIs, not to deliver real-time notifications via CometD. Adding triggers and flows introduces complexity and still does not provide native streaming capabilities. Salesforce’s Streaming API already solves the requirement without custom logic, making this option unsuitable.
Reference
PushTopic Events (Legacy)
A company that is a leading provider of courses and training delivers courses using third-party trainers. Each trainer has to be verified by 10 different training accreditation verification agencies before providing training for the company. Each agency has its own response time, which can take days to confirm a trainer. The company decided to automate the trainer accreditation verification process by integrating with the agencies' web services. What is the recommended approach to automate this process?
A.
Use Salesforce External Services to make the callout; External Services should check the verification agencies until the result is verified, then update the trainer status to "verified".
B.
Create a trigger on the trainer record to make a callout to each verification agency, write business logic to consolidate the verifications, then update the trainer status to "verified".
C.
Make an Apex callout using the @future annotation to call all the different agencies. The response should update the trainer status to "verified".
D.
Use middleware to handle the callout to the 10 different verification services; the middleware will handle the business logic of consolidating the verification results from the 10 services, then make a call-in to Salesforce and update the verification status to "verified".
Use middleware to handle the callout to the 10 different verification services; the middleware will handle the business logic of consolidating the verification results from the 10 services, then make a call-in to Salesforce and update the verification status to "verified".
Explanation
This integration challenge involves coordinating with 10 external systems that have variable response times spanning days. The core requirement is handling long-running, asynchronous processes that are beyond Salesforce's execution time limits. The solution must reliably manage multiple external calls, consolidate results, and update Salesforce without hitting platform constraints.
✔️ Correct Option
(D) ✅ Use middleware to handle the call out... Middleware is ideal for orchestrating complex, long-running integrations across multiple systems. It can manage the 10 different verification calls asynchronously, handle response times that take days, implement retry logic for failures, and consolidate all results before making a single call back to Salesforce. This approach avoids Salesforce governor limits for long-running transactions and provides robust error handling.
❌ Incorrect Options
(A) 🚫 Use salesforce external service... While External Services simplify declarative callouts, they still operate within Salesforce transaction limits. They cannot handle response times that take days and would time out. Continuously polling agencies from Salesforce is inefficient and would quickly consume API limits.
(B) 🚫 Create a trigger... to make a Callout... This approach is fundamentally flawed because triggers cannot make direct callouts. Even with asynchronous processing, coordinating 10 separate verifications from within Salesforce would be complex and unreliable given the potentially days-long response times.
(C) 🚫 Make an apex callout using @future... A @future method runs as a single short-lived asynchronous transaction, and callouts are capped at a 120-second timeout, making this approach unsuitable for processes that might take days to complete. The platform would time out long before receiving responses from the verification agencies, and there is no built-in way to correlate and consolidate 10 delayed responses.
📚 Reference
The official Salesforce Architecture resources recommend middleware for "long-running processes involving multiple external systems" where response times exceed platform limits. This aligns with the enterprise integration pattern of using an external orchestration layer to manage complex, multi-system workflows.
An Integration Architect has built a solution using REST API, updating Account, Contact, and other related information. The data volumes have increased, resulting in higher API calls consumed, and some days the limits are exceeded. A decision was made to decrease the number of API calls using bulk updates. The customer prefers to continue using REST API to avoid architecture changes. Which REST API composite resources should the Integration Architect use to allow up to 200 records in one API call?
A.
SObject Collections
B.
SObject Tree
C.
Batch
D.
Composite
SObject Collections
Explanation
Data growth has spiked API calls for Account/Contact updates, hitting daily limits. The goal is to slash call volume using REST API only (no redesign). A composite resource must bundle up to 200 records in one request for bulk updates — reducing consumption dramatically while keeping the current architecture intact.
✅ Correct Option: A. SObject Collections
Enables bulk processing of up to 200 records (same or different sObject types) in one REST call.
Uses POST (create) or PATCH (update) on /composite/sobjects with a records array; supports allOrNone for atomicity.
Perfect for flat, non-nested updates; it directly replaces up to 200 individual calls.
Stays fully within REST; no Bulk API or schema changes needed.
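As a sketch, the client only needs to chunk its pending updates into groups of 200 and send each group as one `PATCH /services/data/vXX.0/composite/sobjects` body. The record IDs and the `Phone` value below are made up for illustration; the `allOrNone` flag and `attributes`/`id` structure follow the SObject Collections request format.

```python
import json

# Sketch of building SObject Collections update payloads
# (PATCH /composite/sobjects): up to 200 records per call.
def build_update_payloads(records, all_or_none=True, chunk_size=200):
    """Split a list of record dicts into SObject Collections request bodies."""
    for i in range(0, len(records), chunk_size):
        yield {
            "allOrNone": all_or_none,
            "records": records[i:i + chunk_size],
        }

# 450 pending Account updates (illustrative IDs and field values)
updates = [
    {"attributes": {"type": "Account"}, "id": f"001xx0000000{i:03d}", "Phone": "555-0100"}
    for i in range(450)
]

payloads = list(build_update_payloads(updates))
print(len(payloads), [len(p["records"]) for p in payloads])   # 3 [200, 200, 50]
```

In this example, 450 single-record calls collapse into 3 API calls, which is the consumption reduction the question is after.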
❌ Incorrect Option: B. SObject Tree
Built for parent-child record creation (e.g., Account with nested Contacts).
Supports max 200 records but requires hierarchical structure — not suitable for unrelated bulk updates.
Forces unnecessary nesting; inefficient and incorrect for this flat-data use case.
❌ Incorrect Option: C. Batch
The Composite Batch resource (/composite/batch) does exist, but it supports only up to 25 subrequests per call.
Subrequests execute independently (there is no allOrNone), and each one still operates on a single record or query.
It cannot bundle 200 record updates into one call, so it does not meet the requirement.
❌ Incorrect Option: D. Composite
The Composite resource allows up to 25 subrequests (not 200 records).
Each subrequest is treated as a separate operation — doesn’t achieve true 200-record bulk.
Designed for transactional multi-step logic, not high-volume bulk updates.
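To illustrate the contrast with option A, here is a sketch of a Composite request body (`POST /services/data/vXX.0/composite`): each of its up-to-25 subrequests is a full REST call, and later subrequests can reference earlier results. The object values and `referenceId` names are illustrative.

```python
import json

# Sketch of a Composite request: up to 25 subrequests, with @{ref.field}
# references between steps. Good for multi-step logic, not 200-record bulk.
composite_request = {
    "allOrNone": True,
    "compositeRequest": [
        {
            "method": "POST",
            "url": "/services/data/v58.0/sobjects/Account",
            "referenceId": "newAccount",
            "body": {"Name": "Example Corp"},
        },
        {
            "method": "POST",
            "url": "/services/data/v58.0/sobjects/Contact",
            "referenceId": "newContact",
            # reference the Account created by the previous subrequest
            "body": {"LastName": "Doe", "AccountId": "@{newAccount.id}"},
        },
    ],
}

print(json.dumps(composite_request, indent=2))
```

Each subrequest here touches one record, which is why Composite suits dependent multi-step transactions while SObject Collections suits flat 200-record bulk updates.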
📚 Reference
Salesforce REST API – SObject Collections
Introduction to REST API
Northern Trail Outfitters (NTO) uses Salesforce to track leads, opportunities and order details that convert leads to customers. However, Orders are managed by an external (remote) system. Sales representatives want to view and update real-time order information in Salesforce. NTO wants the data to only persist in the external system. Which type of Integration should an architect recommend to meet this business requirement?
A.
Data Visualization
B.
Data Synchronization
C.
Process Orchestration
D.
Batch Processing
Data Visualization
Explanation
The core requirement is to allow users to view and update data in Salesforce that is physically stored only in an external system. This demands a real-time, virtual integration where Salesforce acts as a UI layer, but the external system remains the single source of truth. The solution must support two-way, immediate communication without replicating and storing the order data within the Salesforce database.
✅ Correct Option
A. Data Visualization
This is the correct pattern. Data Visualization (called Data Virtualization in current Salesforce integration-pattern documentation) displays data from an external system in the Salesforce UI in real time without storing it in the Salesforce database. Salesforce Connect external objects (via OData), Salesforce Canvas, or direct UI-based callouts can surface the external order management system directly within a Salesforce page. This allows reps to view and update orders while all data persistence and logic remain in the remote system, fulfilling the requirement perfectly.
❌ Incorrect Options
B. Data Synchronization
This pattern involves copying and storing data in two or more systems to keep them consistent. It directly violates the key requirement that data "only persist in the external system," as it would require creating and storing Order objects in Salesforce, which would then be synchronized with the external system.
C. Process Orchestration
This pattern focuses on coordinating a long-running business process across multiple systems. While an orchestration might use the real-time visualization, the pattern itself is about managing the process flow, not the specific UI requirement for viewing and updating data without persistence.
D. Batch Processing
This pattern is for moving large volumes of data at scheduled intervals (e.g., nightly). It does not support the "real-time" requirement for sales representatives to view and update information immediately while interacting with a customer.
📚 Reference
For official guidance, refer to the Salesforce Integration Patterns & Practices documentation on the Salesforce Architect website. The Data Virtualization pattern (labeled Data Visualization here) is specifically described as a method for displaying and interacting with external data in the Salesforce UI without storing it locally, which aligns with the requirement for the external system to be the sole system of record.