Salesforce-Marketing-Cloud-Engagement-Consultant Exam Questions With Explanations

The best Salesforce-Marketing-Cloud-Engagement-Consultant practice exam questions, with research-based explanations of each question, will help you prepare for and pass the exam!

Over 15K students have given SalesforceKing a five-star review

Why choose our Practice Test

By familiarizing yourself with the Salesforce-Marketing-Cloud-Engagement-Consultant exam format and question types, you can reduce test-day anxiety and improve your overall performance.

Up-to-date Content

Ensure you're studying with the latest exam objectives and content.

Unlimited Retakes

We offer unlimited retakes, so you can practice every question until you have mastered it.

Realistic Exam Questions

Experience exam-like questions designed to mirror the actual Salesforce-Marketing-Cloud-Engagement-Consultant test.

Targeted Learning

Detailed explanations help you understand the reasoning behind correct and incorrect answers.

Increased Confidence

The more you practice, the more confident you will become in your knowledge to pass the exam.

Study whenever you want, from any place in the world.

Salesforce Salesforce-Marketing-Cloud-Engagement-Consultant Exam Sample Questions 2025

Start practicing today and take the fast track to becoming Salesforce Salesforce-Marketing-Cloud-Engagement-Consultant certified.

22934 already prepared
Salesforce Spring 25 Release
293 Questions
4.9/5.0

A customer wants to limit the number of emails a subscriber receives to a maximum of one email every 14 days. After the 14-day period, the subscriber is eligible to receive the next message. What should a consultant recommend to meet this criteria?

A. Import the identified subscribers into a list when creating the send.

B. Create an exclusion data extension populated with the identified subscribers.

C. Query contacts from the Einstein Engagement Frequency data extension when creating the send.

D. Create a suppression list populated with the identified subscribers.

B.    Create an exclusion data extension populated with the identified subscribers

Explanation:

Why B is Correct
To enforce a rule where subscribers can receive a maximum of one email every 14 days, the most effective solution is to use an exclusion data extension. This DE can be populated (via a SQL Query or Automation Studio process) with subscribers who have received an email within the last 14 days. When creating a send, this exclusion DE is applied so that those subscribers are automatically excluded from the audience. After the 14‑day period has passed, they will no longer appear in the exclusion DE and will be eligible to receive the next message. This approach is flexible, scalable, and aligns with Marketing Cloud best practices for frequency capping.
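
The daily refresh described above can be sketched as a SQL Query Activity in Automation Studio. This is a minimal sketch, not the only valid query: it assumes a hypothetical target exclusion DE named Exclusion_Last14Days with a SubscriberKey field, and relies on the standard _Sent tracking data view.

```sql
/* Rebuild the exclusion DE each day with everyone who received an
   email in the last 14 days. Exclusion_Last14Days is a hypothetical
   target DE name; _Sent is a standard system data view. Set the
   query's data action to Overwrite so subscribers drop out of the
   exclusion automatically once 14 days have passed. */
SELECT DISTINCT s.SubscriberKey
FROM _Sent s
WHERE s.EventDate > DATEADD(DAY, -14, GETDATE())
```

Running this query daily (with the Overwrite data action) keeps the exclusion DE self-maintaining: no manual list management is needed.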

Why the Other Options Are Incorrect

A. Import the identified subscribers into a list when creating the send: This is a manual process and not scalable. It requires human intervention for every send, which is error‑prone and inefficient compared to automated exclusion DEs.

C. Query contacts from the Einstein Engagement Frequency data extension when creating the send: Einstein Engagement Frequency provides predictive insights into optimal send frequency but does not enforce hard rules like “one email every 14 days.” It is advisory, not prescriptive, so it cannot guarantee compliance with the requirement.

D. Create a suppression list populated with the identified subscribers: Suppression lists are designed for permanent exclusions (e.g., competitors, unsubscribes, or compliance requirements). They are not intended for temporary exclusions based on time windows. Using a suppression list would block subscribers indefinitely rather than allowing them to re‑enter after 14 days.

References
Salesforce Help: Exclusion Data Extensions

A financial company wants to use Marketing Cloud to send late payment notices to accounts whose payment due date lapsed the previous week. The company has shared the following:

Payments.csv will arrive on the Enhanced SFTP each Monday at 1 a.m.
Payments.csv will be encrypted.
Payments.csv will contain data from the previous week.
Late payment notices will be sent each Monday at noon.
They need to receive a file containing customers who opened or clicked on the late payment notice email within five days after send.

Which automation sequence represents a viable solution?

A. File Transfer > Import File > Filter > Wait > Send Email > SQL Query > Wait > Data Extract > File Transfer

B. Import File > File Transfer > SQL Query > Wait > Send Email > Wait > SQL Query > Data Extract > File Transfer

C. File Transfer > File Transfer > Import File > SQL Query > Wait > Send Email > SQL Query > File Transfer

D. File Transfer > Import File > Filter > Wait > Send Email > Wait > SQL Query > Data Extract > File Transfer

D.   File Transfer > Import File > Filter > Wait > Send Email > Wait > SQL Query > Data Extract > File Transfer

Explanation:

📘 Why Option D Is Correct
Let’s map the requirements step by step:

Payments.csv arrives encrypted on Enhanced SFTP at 1 a.m. → File Transfer activity is required first to decrypt and prepare the file.

Payments.csv must be imported into a data extension. → Import File activity loads the data into the target late payment DE.

Segment accounts whose payment due date lapsed the previous week. → Filter Activity isolates the correct audience from the imported data.

Send late payment notices each Monday at noon. → Wait Activity ensures the send occurs at the correct time (noon). → Send Email Activity delivers the late payment notices.

Track customers who opened or clicked within 5 days after send. → Wait Activity (5 days) allows time for engagement. → SQL Query Activity queries the tracking data views (_Open, _Click) to identify engaged customers.

Provide a file of engaged customers. → Data Extract Activity creates the export file. → File Transfer Activity moves the file back to SFTP for delivery.
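
The post-wait SQL Query step above could look roughly like the following. This is a sketch only: the five-day window is approximated with EventDate, and restricting the results to the specific late payment send (for example by joining the _Job data view) is omitted for brevity; _Open and _Click are standard tracking data views.

```sql
/* Identify subscribers who opened or clicked within the last 5 days.
   In practice you would also filter by the late payment notice's
   JobID (e.g., via the _Job data view) so other sends are excluded. */
SELECT o.SubscriberKey, 'Open' AS EngagementType
FROM _Open o
WHERE o.IsUnique = 1
  AND o.EventDate > DATEADD(DAY, -5, GETDATE())
UNION
SELECT c.SubscriberKey, 'Click' AS EngagementType
FROM _Click c
WHERE c.IsUnique = 1
  AND c.EventDate > DATEADD(DAY, -5, GETDATE())
```

The query's target DE then feeds the Data Extract and final File Transfer steps.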

❌ Why Other Options Are Incorrect
A. File Transfer > Import File > Filter > Wait > Send Email > SQL Query > Wait > Data Extract > File Transfer
Incorrect order: SQL Query runs before the 5-day wait, so engagement data wouldn’t be captured properly.

B. Import File > File Transfer > SQL Query > Wait > Send Email > Wait > SQL Query > Data Extract > File Transfer
Wrong order: the File Transfer must occur before the Import so the file can be decrypted first, and a SQL Query before the send has no tracking data to work with.

C. File Transfer > File Transfer > Import File > SQL Query > Wait > Send Email > SQL Query > File Transfer
Contains a redundant second File Transfer, runs a SQL Query before the send, and omits both the five-day Wait after the send and the Data Extract needed to create the output file.

🔗 References
Salesforce Help: Automation Studio Activities
Trailhead: Marketing Cloud Automation Studio Basics
Salesforce Documentation: Data Views for Tracking (Open, Click)

A customer will provide a single daily file on the Marketing Cloud SFTP at 3 AM and needs an alert if the file is not present on time. The file needs to be:

Imported into a staging data extension.
Separated into two different data extensions.

Which workflow should meet these requirements?

A. Scheduled Automation: File Transfer Activity > Import File Activity > SQL Query Activity 1 > SQL Query Activity 2

B. Scheduled Automation: Import File Activity > SQL Query Activity 1 > SQL Query Activity 2

C. File Drop Automation: Import File Activity > SQL Query Activity 1 > SQL Query Activity 2

D. File Drop Automation: File Transfer Activity > Import File Activity > Filter Activity > SQL Query Activity 1

B.    Scheduled Automation: Import File Activity > SQL Query Activity 1 > SQL Query Activity 2

Explanation:

Scheduled Automation
Since the file is expected daily at 3 AM, a Scheduled Automation is the right choice.
A File Drop Automation only triggers when a file arrives, so it cannot alert anyone when the file is missing. A Scheduled Automation that runs at (or shortly after) 3 AM will fail its import step if the file is absent, and that failure generates a notification, which satisfies the alert requirement.

Import File Activity
Loads the file from the Marketing Cloud SFTP directly into the staging data extension. Because the file is already on the Marketing Cloud SFTP and is not described as encrypted or compressed, no File Transfer Activity is needed before the import.

SQL Query Activities (1 & 2)
Split the staging data into the two target data extensions based on business rules.
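
As an illustration, the two SQL Query Activities could each copy one segment of the staging DE into its own target. The DE and field names below (Staging_DE, Region) are hypothetical; Activity 2 would mirror Activity 1 with the opposite condition and the second target DE.

```sql
/* SQL Query Activity 1: route one business segment to its target DE.
   Staging_DE and Region are hypothetical names; Activity 2 would use
   WHERE Region <> 'US' (or whatever the second rule is) and target
   the second data extension. */
SELECT SubscriberKey, EmailAddress, Region
FROM Staging_DE
WHERE Region = 'US'
```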

❌ Why the other options are incorrect:
A. Scheduled Automation with File Transfer Activity
Includes an unnecessary File Transfer Activity. File Transfer is used to decrypt or decompress files, or to move them between locations; here the Import File Activity can read the file from the Marketing Cloud SFTP directly.

C. File Drop Automation
Triggers only when the file arrives.
Does not provide an alert if the file is missing at 3 AM.

D. File Drop Automation with Filter Activity
Same issue as C (no alert if missing).
Also incorrectly uses a Filter Activity and a single SQL Query instead of two SQL Queries to split the data into two extensions.

🔗 References:
Salesforce Help: Automation Studio Activities
Trailhead: Marketing Cloud Automation Basics

A customer wants to automate a series of three emails as part of a Membership renewal drip campaign.

Email #1 will be sent one month prior to the member's renewal date
Email #2 will be sent one week prior to the member's renewal date
Email #3 will be sent on the member's renewal date
A master audience is updated in real time via the API

Which steps should be included in the customer's automation?

A. Import activity → Three filter activities → Three send definitions to the filtered audiences

B. Three send definitions to the master data extension

C. Import activity → Three send definitions to the master data extension

D. Three filter activities → Three send definitions to the filtered audiences

D.   Three filter activities → Three send definitions to the filtered audiences

Explanation:

Here’s why:

You have:
A master audience Data Extension, updated in real time via the API

Three emails based on relative time to Renewal Date:

Email 1: 1 month before
Email 2: 1 week before
Email 3: On the renewal date

You want a recurring/scheduled Automation Studio process that, each day:
Identifies who should get which email that day, based on the renewal date.
Sends the correct email to those people.

Why D is correct

Three Filter Activities
Filter 1: members whose renewal date = Today + 30 days → audience for Email #1
Filter 2: members whose renewal date = Today + 7 days → audience for Email #2
Filter 3: members whose renewal date = Today → audience for Email #3

Each filter produces a filtered Data Extension (or filtered audience) containing only the members who should receive that specific email on that run.

Three Send Definitions
Each filtered DE then feeds into its corresponding Send Definition:
Filtered DE #1 → Send Email #1
Filtered DE #2 → Send Email #2
Filtered DE #3 → Send Email #3

The automation can run daily, and the filters will always pull the correct audience based on the dates.
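
For comparison, the date logic behind Filter 1 could also be expressed as a SQL Query Activity. This is a sketch only; the DE and field names (Membership_Master, RenewalDate) are hypothetical.

```sql
/* Audience for Email #1: members whose renewal date is exactly
   30 days from today. Membership_Master and RenewalDate are
   hypothetical names; Filters 2 and 3 would use +7 days and today. */
SELECT SubscriberKey, EmailAddress, RenewalDate
FROM Membership_Master
WHERE CONVERT(DATE, RenewalDate) = CONVERT(DATE, DATEADD(DAY, 30, GETDATE()))
```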

Why the other options are wrong

A. Import activity → Three filter activities → Three send definitions
The master audience is already updated via API in real time. No need for an Import Activity in the automation.

B. Three send definitions to the master data extension
This would send all three emails to everyone in the master DE, regardless of renewal date — not date-driven targeting.

C. Import activity → Three send definitions to the master data extension
Same problem as B (no segmentation by date), plus unnecessary Import.

So the correct steps for the automation are:
D. Three filter activities followed by three send definitions to the filtered audiences. ✅

Northern Trail Outfitters (NTO) has been storing web behavior data in a data extension for several years. With several hundred million rows, the data extension has begun to impact performance. NTO indicates they only need to store data from the previous twelve months, which will not exceed eighty million rows. Which two methods would allow them to utilize a Retention Policy? (Choose 2 answers)

A. Clear data from the current data extension completely, then reconfigure a Retention Period via Email Studio.

B. Delete data from the data extension prior to twelve months ago, then configure a Retention Period via Contact Builder.

C. Reconfigure the current data extension as-is with a Retention Period via Contact Builder.

D. Replace the current data extension with a new data extension configured with a Retention Period.

B.    Delete data from the data extension prior to twelve months ago, then configure a Retention Period via Contact Builder.
D.    Replace the current data extension with a new data extension configured with a Retention Period.

Explanation:

Retention Policies in Marketing Cloud
Retention Policies allow you to automatically remove data from a data extension after a defined period of time. However, they only apply to new data added after the policy is configured. Existing data older than the retention period must be manually removed before the policy can take effect.

Option B ✅
If NTO deletes all rows older than twelve months, they can then configure a Retention Policy in Contact Builder to ensure that only data within the last twelve months is retained going forward. This approach cleans up the current data extension and prevents future performance issues.

Option D ✅
Another valid approach is to create a new data extension with a Retention Policy already configured. This ensures that only data within the defined retention window is stored. The old data extension can be retired once the new one is in use.
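
Under option D, the new DE could be seeded with a one-time SQL Query Activity that copies across only the rows NTO needs to keep. This is a sketch with hypothetical names (WebBehavior_Old as the source, plus PageURL and EventDate fields); the query's target would be the new retention-enabled DE.

```sql
/* One-time backfill into the new retention-enabled DE: keep only
   the previous twelve months of web behavior. All DE and field
   names here are hypothetical. */
SELECT SubscriberKey, PageURL, EventDate
FROM WebBehavior_Old
WHERE EventDate > DATEADD(MONTH, -12, GETDATE())
```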

Why Not the Other Options

A. Clear data from the current data extension completely, then reconfigure a Retention Period via Email Studio
Clearing the data extension completely would also delete the last twelve months of data that NTO needs to keep. In addition, data retention for an existing data extension is managed in Contact Builder, so clearing the data alone does not solve the issue unless retention is properly configured there.

C. Reconfigure the current data extension as-is with a Retention Period via Contact Builder
Simply adding a retention policy does not remove existing data older than twelve months. Since NTO already has hundreds of millions of rows, this would not resolve the performance issue.

📘 References
- Salesforce Help: Data Retention Policies in Marketing Cloud
- Trailhead: Contact Builder in Marketing Cloud module

Prep Smart, Pass Easy: Your Success Starts Here!

Transform Your Test Prep with Realistic Salesforce-Marketing-Cloud-Engagement-Consultant Exam Questions That Build Confidence and Drive Success!