Consumer-Goods-Cloud-Accredited-Professional Practice Test

Salesforce Spring '25 Release
Updated On 1-Jan-2026

123 Questions

United Telecom is moving its assets to Communications Cloud as part of its digital transformation. During the asset migration process, a Consultant includes a step to create a No change MACD order.

Why is it necessary to have this step in the migration process?

A. To validate if decomposition works on migrated assets

B. To create Inventory Items to be used in subsequent MACD

C. To validate if migrated asset data aligns with asset data model

D. To validate if MACD works on migrated assets

D.   To validate if MACD works on migrated assets

Explanation:
A "No change" MACD (Move, Add, Change, Disconnect) order is a test order that references an existing asset but makes no actual changes to it. The purpose is to validate the integrity and readiness of the migrated assets within the new system's order management framework.

Correct Option:

D. To validate if MACD works on migrated assets
This is the primary reason. Creating a "no change" order tests the end-to-end MACD process on migrated assets:

Can the system correctly decompose the asset into an order?

Do all asset relationships, attributes, and references survive the order decomposition and recomposition cycle?

Does the orchestration plan trigger appropriately?

It confirms that the migrated assets are fully functional for future modifications.
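The round-trip check that a "no change" order exercises can be illustrated with a minimal sketch. This is Python pseudologic, not Communications Cloud APIs; the asset structure, field names, and the decompose/recompose functions are simplified stand-ins for the real order decomposition and asset update steps:

```python
# Illustrative only: a "no change" MACD order should decompose a migrated
# asset into order lines and recompose it back without losing anything.
def decompose(asset):
    """Flatten an asset and its child assets into order line items."""
    lines = [{"code": asset["code"], "attrs": dict(asset["attrs"])}]
    for child in asset.get("children", []):
        lines.extend(decompose(child))
    return lines

def recompose(lines):
    """Rebuild an asset tree from order lines (root first, children after)."""
    root = {"code": lines[0]["code"], "attrs": dict(lines[0]["attrs"]), "children": []}
    for line in lines[1:]:
        root["children"].append(
            {"code": line["code"], "attrs": dict(line["attrs"]), "children": []}
        )
    return root

def validate_no_change(asset):
    """A 'no change' order passes if the round trip preserves the asset."""
    return recompose(decompose(asset)) == asset

# Hypothetical migrated asset with one child asset.
migrated = {"code": "INTERNET_100", "attrs": {"speed": "100Mbps"},
            "children": [{"code": "ROUTER", "attrs": {"model": "X1"}, "children": []}]}
print(validate_no_change(migrated))  # True: the asset survives the round trip
```

If migration dropped an attribute or broke a parent-child relationship, the recomposed asset would no longer match the original and the check would fail, which is exactly the class of defect a "no change" order surfaces before real MACD orders are attempted.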

Incorrect Options:

A. To validate if decomposition works on migrated assets
This is part of validating MACD, but it's too narrow. "Decomposition" is just one step in the MACD process. The "no change" order tests the entire MACD workflow, including decomposition, order processing, and recomposition.

B. To create Inventory Items to be used in subsequent MACD
A "no change" order does not create inventory items. Inventory items are typically created from product definitions or during provisioning. This order is a test, not a setup step.

C. To validate if migrated asset data aligns with asset data model
Data model alignment should be verified before or during the data load via data validation rules and mapping. A MACD order tests functional behavior, not static data structure alignment.

Reference:
In Salesforce Communications Cloud migration best practices, a "No Change" MACD order is a standard validation step post-asset migration. It ensures that migrated assets are properly integrated with the Order Management and Orchestration systems and are ready for real-life modification orders. This is covered in asset migration and cutover strategy guides.

Which methodology does Salesforce Maps offer to facilitate the addition of geocoordinates for each retail store location by an admin?

A. Set up a batch job through Salesforce Maps automation to read an address field and add geocoordinates to selected fields.

B. Create a new map layer containing the desired retail store locations and execute the 'add geocodes' mass action.

C. Click on the individual retail store marker and copy/paste the coordinates from the tooltip to the corresponding retail store longitude and latitude fields.

D. Under geolocation in settings, enable the 'automate geocoordinate mapping to retail store' option.

A.   Set up a batch job through Salesforce Maps automation to read an address field and add geocoordinates to selected fields.

Explanation:
Salesforce Maps enables admins to efficiently geocode retail store locations by converting address data into latitude and longitude coordinates. For bulk additions across many stores, the platform's automation features are key. This ensures accurate mapping for route planning, territory management, and location-based analytics in Consumer Goods Cloud without manual intervention, supporting scalability for large retail networks.

Correct Option:

A – Set up a batch job through Salesforce Maps automation to read an address field and add geocoordinates to selected fields.
Batch automation in Salesforce Maps is the recommended method for admins to process multiple records at once. It reads address fields (e.g., Billing Street, City, State) on the retail store object, queries a geocoding API (like Google or HERE), and populates the designated Latitude__c and Longitude__c fields. This runs as a scheduled or on-demand job, handling high volumes without impacting performance.

Incorrect Option:

B – Create a new map layer containing the desired retail store locations and execute the 'add geocodes' mass action.
While map layers visualize data and mass actions exist for updates like field edits, there is no specific 'add geocodes' mass action tied to layers in Salesforce Maps. Geocoding is handled via on-demand plotting or batch jobs, not layer-based mass actions, which could lead to incomplete or manual processing.

C – Click on the individual retail store marker and copy/paste the coordinates from the tooltip to the corresponding retail store longitude and latitude fields.
This manual approach is feasible for small datasets but impractical for admins managing many stores. It requires plotting each record individually, lacks automation, and risks errors from copy-paste, violating efficiency best practices for bulk geocoordinate addition.

D – Under geolocation in settings, enable the 'automate geocoordinate mapping to retail store' option.
Salesforce Maps settings include base object configuration for geocoding fields, but no toggle named ‘automate geocoordinate mapping to retail store’ exists. Automation is achieved through explicit batch job setup or flows, not a simple enablement option.

Reference:
Salesforce Trailhead → “Improve Location Data Accuracy in Salesforce Maps” (Configuring Your Salesforce Maps Environment Module)

Salesforce Help → “Batch Automation for Geocoding Records in Maps”

Salesforce Maps Developer Guide → “Get the Geographical Coordinates of Addresses in Batch”

Universal Containers is using Communications Cloud for their B2B use cases. They have an integration with a legacy stack that will handle network provisioning and billing. As part of their Order Management process they have to send the customer data to the legacy app, which in turn provisions billing.

What should a Consultant recommend to make this callout easier to configure, easier to maintain, and performant?

A. Create a custom integration adapter to fetch the customer info and pass it to the payload that will be sent to the external application.

B. Model the customer data as fields on Order and pass the fields along with the other attributes to the payload.

C. Model the customer data to Technical Products along with other Products and Services and create Decomposition relationships accordingly to send the right information within the callout tasks.

D. Model the customer data as multi picklist attributes within the cart and create Decomposition relationships accordingly to send the right information within the callout tasks.

B.   Model the customer data as fields on Order and pass the fields along with the other attributes to the payload.

Explanation:
The requirement is to send customer data to a legacy application during the Order Management process, specifically for network provisioning and billing setup. Customer data (like customer ID, account number, billing address) is account-level information that is static for the entire order, not service-specific. The most straightforward, performant, and maintainable way to pass this information is to extract it once and map it directly from the parent Order record to the external system payload, typically using a DataRaptor.

✅ Correct Option:

B. Model the customer data as fields on Order and pass the fields along with the other attributes to the payload.
Easier to Configure and Maintain: Customer data already resides on the related Account and Order objects. The consultant can simply map these fields directly from the Order header (or its related Account) to the payload using an Integration Procedure/DataRaptor. This avoids complex data modeling on the product side and is highly visible and easy to manage.

Performant: Extracting the data once from the single Order record is much more efficient than trying to embed and extract it across potentially hundreds of technical line items. Since the data is constant for the entire order, repeating the data extraction or modeling logic is unnecessary overhead.
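The efficiency argument can be made concrete with a small sketch in the spirit of a declarative DataRaptor mapping. All field and key names below are hypothetical; the point is that customer data is mapped once from the Order header rather than repeated on every decomposed line item:

```python
# Illustrative sketch (hypothetical field names): customer data lives once
# on the Order header, so the callout payload maps it exactly once.
ORDER_TO_PAYLOAD = {              # declarative mapping, DataRaptor-style
    "AccountNumber__c": "customerAccount",
    "BillingCity__c": "billingCity",
    "OrderNumber": "orderRef",
}

def build_payload(order, line_items):
    # Header-level customer data: extracted once, regardless of item count.
    payload = {target: order[source] for source, target in ORDER_TO_PAYLOAD.items()}
    # Line items carry only service-specific data, no repeated customer info.
    payload["items"] = [{"productCode": li["ProductCode"]} for li in line_items]
    return payload

order = {"AccountNumber__c": "ACC-001", "BillingCity__c": "Austin", "OrderNumber": "ORD-42"}
items = [{"ProductCode": "SIP-TRUNK"}, {"ProductCode": "ROUTER"}]
print(build_payload(order, items)["customerAccount"])  # ACC-001
```

Contrast this with options C and D, where the same customer values would have to be stamped onto (and kept consistent across) every technical line item generated by decomposition.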

❌ Incorrect Options:

A. Create a custom integration adapter to fetch the customer info and pass it to the payload that will be sent to the external application.
This introduces custom Apex code (a custom adapter) to fetch simple customer information, which is a high-cost, high-maintenance solution. Industries Cloud provides standard low-code tools like DataRaptors and Integration Procedures which are designed for this exact purpose, making a custom adapter unnecessary and harder to maintain.

C. Model the customer data to Technical Products along with other Products and Services and create Decomposition relationships accordingly to send the right information within the callout tasks.
Customer data is not a product or service attribute; it's account-level metadata. Modeling customer data onto Technical Products would force the data to be repeated on every technical line item generated from the decomposition process. This is inefficient, increases storage usage, and makes extraction more complex, as you'd have to ensure consistency across all line items.

D. Model the customer data as multi picklist attributes within the cart and create Decomposition relationships accordingly to send the right information within the callout tasks.
Modeling simple, single-value customer metadata (like an account number) as a multi-picklist attribute is incorrect and overly complex. Multi-picklists are designed for selecting multiple, non-mutually exclusive values. Furthermore, forcing this data through the decomposition relationship adds unnecessary complexity, as outlined in option C.

📖 Reference:
Salesforce Communications Cloud Integration Patterns: Guidance on leveraging standard data extraction tools (DataRaptors) to build callout payloads. Best practice suggests extracting order-level and account-level metadata from the Order/Account header records for integration tasks, rather than embedding it within product or service line items.

An organization wants to maintain data related to line items and assets in custom objects under the line items (Object A) and assets (Object B).

What will ensure the data is saved under assets during assetization and can be leveraged for MACD Orders?

A. Write an Apex hook class during Checkout and AssetToOrder for creating the records as a post step on the API.

B. Use Object Mapper to Map the line item object from A to B.

C. Use Field Mapper to map fields from Object A to B and another mapping from Object B to A.

D. Use Object Mapper to map the line item object from A to B and another mapping from Object B to A.

D.   Use Object Mapper to map the line item object from A to B and another mapping from Object B to A.

Explanation:
In Salesforce Industries (Communications Cloud), assetization is the process of creating Asset records from sold products. MACD orders involve modifying existing assets. To pass custom data between the quote line item (QLI) context (Object A) and the Asset context (Object B), you need a bidirectional mapping that works during both asset creation and later modifications.

Correct Option:

D. Use Object Mapper to map the line item object from A to B and another mapping from Object B to A.
This is the declarative, supported method.

Object Mapper is a tool in Salesforce Industries that defines how fields map between different objects during processes like assetization and order decomposition.

Mapping A → B ensures custom data flows from the QLI custom object (A) to the Asset custom object (B) during asset creation.

Mapping B → A ensures that when a MACD order is created from an existing asset, the data flows back from the Asset (B) to the new QLI (A) for modification.

This two-way mapping is essential for MACD scenarios.
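The bidirectional requirement can be illustrated with a minimal sketch. The object and field names are hypothetical, and `apply_mapping` is a stand-in for what the Object Mapper does declaratively; the point is that one mapping drives assetization (A → B) and its inverse drives MACD (B → A):

```python
# Illustrative sketch (hypothetical field names, not Object Mapper syntax).
A_TO_B = {"Line_Custom__c": "Asset_Custom__c", "Term__c": "Asset_Term__c"}
B_TO_A = {v: k for k, v in A_TO_B.items()}   # reverse mapping, needed for MACD

def apply_mapping(record, mapping):
    """Copy fields from one object shape to another per the mapping."""
    return {target: record[source] for source, target in mapping.items() if source in record}

qli_custom = {"Line_Custom__c": "gold-tier", "Term__c": 24}
asset_custom = apply_mapping(qli_custom, A_TO_B)   # assetization: A -> B
back_to_qli = apply_mapping(asset_custom, B_TO_A)  # MACD order:   B -> A
print(back_to_qli == qli_custom)  # True: data survives the round trip
```

With only the A → B mapping (option B), `asset_custom` would be populated at assetization, but a later MACD order would start from an empty line item because nothing defines how Asset data flows back.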

Why Others Are Incorrect:

A. Write an Apex hook class during Checkout and AssetToOrder for creating the records as a post step on the API.
This is a custom code solution. While it might work, it is not the recommended best practice when a declarative tool (Object Mapper) exists. It also adds maintenance overhead and risk. The question implies a standard solution.

B. Use Object Mapper to Map the line item object from A to B.
This is only one-way (A → B). It would work for initial asset creation but would fail for MACD orders. When modifying an existing asset, data from B needs to populate the new line item (A), requiring the reverse mapping.

C. Use Field Mapper to map fields from Object A to B and another mapping from Object B to A.
Field Mapper is used for field-level transformations within a single object mapping, not for creating separate object-to-object mappings. The correct tool for object-to-object mapping is Object Mapper.

Reference:
In Salesforce Industries Order Management, Object Mapper is the declarative component used to define data transformations between objects (e.g., Quote Line Item → Asset, Asset → Order Product) during processes like assetization and order decomposition for MACD. Bidirectional mappings are standard for MACD support. This is covered in the "Data Mapping and Transformation" guides for Industries implementations.

A communications company wants to improve their quote-to-order journey experience. The journey has several steps, which include selecting products and services, and integration with the inventory system for device reservation. They want to create a modern, multi-channel experience.

What approach should a Consultant take during planning to ensure optimal development and time to market?

A. Knowing the exact data exchanged in integration is an input to the UX design. Detailed design of the integration step is required before UX design can start.

B. UX experience is the most important. Fully design and validate the UX before designing the integration step.

C. Plan for three user stories running sequentially: UX Design first, Device Reservation API second, and Inventory System Integration last.

D. Plan for three user stories running in parallel: UX Design, Device Reservation API, and Inventory System Integration. UX only requires the API information to be complete.

D.   Plan for three user stories running in parallel: UX Design, Device Reservation API, and Inventory System Integration. UX only requires the API information to be complete.

Explanation:
For a modern, multi-channel project involving both user experience (UX) and back-end integration (API and Inventory System), a parallel development approach is critical for speed. The UX design team and the integration team can work concurrently. The UX team needs the API specifications (endpoints, data structure, expected response times) to be defined so they can design the user interface components (e.g., loading spinners, error messages, fields for device selection) that interact with those APIs, but they don't need the integration implementation to be complete.

✅ Correct Option:

D. Plan for three user stories running in parallel: UX Design, Device Reservation API, and Inventory System Integration. UX only requires the API information to be complete.
Parallelism: Running the three distinct work streams (UX Design, API Definition, and Inventory Integration) simultaneously significantly reduces the overall time-to-market compared to a sequential approach.

Decoupling: The UX team needs the contract (the information) of the Device Reservation API (i.e., the data model and endpoints), not the fully built back-end integration itself. They can design and build the front-end components, often using mock data, based on the API specification.

Efficiency: This is the standard Agile approach for complex projects, ensuring all parts of the solution are developed efficiently and integrated in later sprints.
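The decoupling point can be sketched concretely: once the API contract is agreed, the UX team codes against a mock that honors it while the real Device Reservation API and inventory integration are built in parallel. All names below are hypothetical illustrations, not an actual reservation API:

```python
# Illustrative sketch: the agreed contract is the only shared dependency.
RESERVATION_CONTRACT = {
    "request": ["deviceSku", "quantity"],
    "response": ["reservationId", "status"],
}

def mock_reserve_device(request):
    """Mock honoring the contract; swapped for the real API in a later sprint."""
    assert set(request) == set(RESERVATION_CONTRACT["request"]), "contract violation"
    return {"reservationId": "MOCK-001", "status": "RESERVED"}

# Front-end code depends only on the contract shape, so replacing the mock
# with the real integration later requires no UX changes.
response = mock_reserve_device({"deviceSku": "PHONE-X", "quantity": 1})
print(response["status"])  # RESERVED
```

This is why the UX story needs only the API *information* (the contract) to be complete, not the finished integration: the mock satisfies the same interface the real back end will.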

❌ Incorrect Options:

A. Knowing the exact data exchanged in integration is an input to the UX design. Detailed design of the integration step is required before UX design can start.
This approach is sequential and slow. UX only requires the API specification (the contract), not the detailed implementation of the Inventory System Integration. Waiting for the full integration design will unnecessarily delay the UX work.

B. UX experience is the most important. Fully design and validate the UX before designing the integration step.
This is inefficient and risky. The integration capabilities and data model are a hard constraint on the UX. Designing the UX in a vacuum without knowing what data the APIs can provide or how long they will take to respond can lead to designs that are technically unfeasible, requiring costly rework later.

C. Plan for three user stories running sequentially: UX Design first, Device Reservation API second, and Inventory System Integration last.
This is a waterfall/sequential approach which sacrifices time-to-market. While logically ordered, it is not the optimal development strategy. The API and Integration work should start alongside the UX work to maximize efficiency.

📖 Reference:
Agile Software Development and Systems Integration Best Practices: Principles of parallel development, API-first design, and defining interface contracts early to enable simultaneous work on front-end (UX/UI) and back-end (integration/API) systems.
