Which solution provides an easy way to ingest Marketing Cloud subscriber profile attributes into Data Cloud on a daily basis?
Correct Answer:
C
The solution that provides an easy way to ingest Marketing Cloud subscriber profile attributes into Data Cloud on a daily basis is the Marketing Cloud Data Extension Data Stream. This feature lets customers stream data from Marketing Cloud data extensions into Data Cloud. Customers select which data extensions to stream, and Data Cloud automatically creates and updates the corresponding data model objects (DMOs). Customers can also map data extension fields to DMO attributes through the user interface or an API. The Data Extension Data Stream therefore lets customers ingest subscriber profile attributes and other Marketing Cloud data into Data Cloud without writing code or setting up complex integrations.
The other options do not provide an easy, daily ingestion path. Automation Studio and the Profile File API can export data from Marketing Cloud to external systems, but they require customers to write scripts, configure file transfers, and schedule automations. The Marketing Cloud Connect API exposes Marketing Cloud data to other Salesforce solutions, such as Sales Cloud or Service Cloud, but it does not support streaming data to Data Cloud. The Email Studio Starter Data Bundle is a data kit containing sample data and segments for Email Studio; it does not contain subscriber profile attributes or stream data to Data Cloud.
References:
✑ Marketing Cloud Data Extension Data Stream
✑ Data Cloud Data Ingestion
✑ Marketing Cloud Data Extension Data Stream API
✑ Marketing Cloud Connect API
✑ Email Studio Starter Data Bundle
A consultant wants to ensure that every segment managed by multiple brand teams adheres to the same set of exclusion criteria, which are updated on a monthly basis. What is the most efficient option to allow for this capability?
Correct Answer:
B
The most efficient option is to create a reusable container block with the common criteria. A container block is a segment component that can be reused across multiple segments and can contain any combination of filters, nested segments, and exclusion criteria. A consultant can create a container block with the exclusion criteria that apply to all segments managed by the brand teams, then add that block to each segment. The consultant can then update the exclusion criteria in one place and have the change reflected in every segment that uses the block.
The other options are less efficient. Creating, publishing, and deploying a data kit is a way to share data and segments across data spaces, but it does not support centrally updating the exclusion criteria each month. Creating a nested segment combines segments using logical operators, but it does not, by itself, exclude individuals based on specific criteria. Creating a segment and copying it for each brand duplicates the same exclusion criteria, so each copy would have to be updated separately rather than in one place.
References:
✑ Create a Container Block
✑ Create a Segment in Data Cloud
✑ Create and Publish a Data Kit
✑ Create a Nested Segment
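The "update in one place" advantage of a container block can be sketched in plain code. This is only an analogy, not Data Cloud's segmentation syntax (segments are built in the UI); all profile fields and domain names here are invented for illustration:

```python
# Shared exclusion criteria, analogous to a reusable container block:
# every brand segment references this one predicate, so editing it here
# updates all segments at once. (Illustrative only -- not Data Cloud code.)
EXCLUDED_DOMAINS = {"competitor.com"}

def common_exclusions(profile):
    """Return True if the profile must be excluded from all brand segments."""
    return (profile["email"].split("@")[-1] in EXCLUDED_DOMAINS
            or profile.get("opted_out", False))

def brand_a_segment(profiles):
    # Brand-specific filter plus the shared exclusion block.
    return [p for p in profiles
            if p["brand_affinity"] == "A" and not common_exclusions(p)]

def brand_b_segment(profiles):
    return [p for p in profiles
            if p["brand_affinity"] == "B" and not common_exclusions(p)]

profiles = [
    {"email": "pat@example.com", "brand_affinity": "A"},
    {"email": "rival@competitor.com", "brand_affinity": "A"},
    {"email": "sam@example.com", "brand_affinity": "B", "opted_out": True},
]
print(brand_a_segment(profiles))
print(brand_b_segment(profiles))
```

Changing `common_exclusions` (or `EXCLUDED_DOMAINS`) once alters the membership of every segment, which is exactly the maintenance property the container block provides.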
Northern Trail Outfitters uses B2C Commerce and is exploring implementing Data Cloud to get a unified view of its customers and all their order transactions.
What should the consultant keep in mind with regard to historical data when ingesting order data using the B2C Commerce Order Bundle?
Correct Answer:
C
The B2C Commerce Order Bundle is a data bundle that creates a data stream to flow order data from a B2C Commerce instance into Data Cloud. However, this data bundle does not ingest any historical data; it only ingests new orders from the time the data stream is created. Therefore, if a consultant wants to ingest historical order data, they need to use a different method, such as exporting the data from B2C Commerce and importing it into Data Cloud using a CSV file.
References:
✑ Create a B2C Commerce Data Bundle
✑ Data Access and Export for B2C Commerce and Commerce Marketplace
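Since the Order Bundle only streams orders created after setup, historical orders must be backfilled separately. A minimal sketch of one backfill step, assuming historical orders were exported from B2C Commerce as records and need flattening into a CSV for file-based ingestion (the field and column names are illustrative assumptions, not an official schema):

```python
import csv
import io

# Hypothetical historical order export from B2C Commerce; a real export's
# shape depends on how the export job is configured.
historical_orders = [
    {"orderNo": "00001001", "customerEmail": "pat@example.com",
     "creationDate": "2022-03-14T09:30:00Z", "orderTotal": 129.99},
    {"orderNo": "00001002", "customerEmail": "sam@example.com",
     "creationDate": "2022-03-15T11:05:00Z", "orderTotal": 54.50},
]

def orders_to_csv(orders):
    """Flatten order records into a CSV string suitable for a file-based
    ingestion flow (column names are illustrative, not a Data Cloud schema)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["OrderId", "Email", "OrderDate", "Total"])
    writer.writeheader()
    for o in orders:
        writer.writerow({
            "OrderId": o["orderNo"],
            "Email": o["customerEmail"],
            "OrderDate": o["creationDate"],
            "Total": f'{o["orderTotal"]:.2f}',
        })
    return buf.getvalue()

print(orders_to_csv(historical_orders))
```

The resulting file could then be loaded through whatever file-upload or ingestion path the org uses; the point is only that the backfill is a separate, one-time job alongside the ongoing data stream.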
Which data stream category should be assigned to use the data for time-based operations in segmentation and calculated insights?
Correct Answer:
B
Data streams are the sources of data that are ingested into Data Cloud and mapped to the data model. Data streams have different categories that determine how the data is processed and used in Data Cloud. Transaction data streams are used for time-based operations in segmentation and calculated insights, such as filtering by date range, aggregating by time period, or calculating time-to-event metrics. Transaction data streams are typically used for event data, such as purchases, clicks, or visits, that has a timestamp and a value associated with it.
References: Data Streams, Data Stream Categories
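To illustrate the kind of time-based operations the transaction category enables, here is a plain-Python sketch (not Data Cloud syntax) of two typical calculated-insight style computations over timestamped events; the event fields are invented for the example:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical transaction events: each has a timestamp and a value,
# the shape of data a transaction-category stream typically carries.
events = [
    {"email": "pat@example.com", "ts": "2023-01-05T10:00:00", "amount": 40.0},
    {"email": "pat@example.com", "ts": "2023-01-20T12:00:00", "amount": 25.0},
    {"email": "sam@example.com", "ts": "2023-02-02T09:30:00", "amount": 80.0},
]

def monthly_totals(events):
    """Aggregate transaction amounts by calendar month (time-period rollup)."""
    totals = defaultdict(float)
    for e in events:
        month = datetime.fromisoformat(e["ts"]).strftime("%Y-%m")
        totals[month] += e["amount"]
    return dict(totals)

def days_since_last_purchase(events, email, as_of):
    """Time-to-event metric: days between a customer's most recent
    transaction and a reference date."""
    stamps = [datetime.fromisoformat(e["ts"])
              for e in events if e["email"] == email]
    return (as_of - max(stamps)).days

print(monthly_totals(events))
print(days_since_last_purchase(events, "pat@example.com", datetime(2023, 3, 1)))
```

Both computations depend on every record carrying a usable timestamp, which is why assigning the right stream category matters for segmentation and calculated insights.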
A consultant needs to package Data Cloud components from one organization to another.
Which two Data Cloud components should the consultant include in a data kit to achieve this goal?
Choose 2 answers
Correct Answer:
AD
To package Data Cloud components from one organization to another, the consultant should include the following components in a data kit:
✑ Data model objects: These are the custom objects that define the data model for Data Cloud, such as Individual, Segment, and Activity. They store the data ingested from various sources and enable the creation of unified profiles and segments.
✑ Identity resolution rulesets: These are the rules that determine how data from different sources is matched and merged to create unified profiles. They specify the criteria, logic, and priority for identity resolution.
References:
✑ 1: Data Model Objects in Data Cloud
✑ 2: Identity Resolution Rulesets in Data Cloud
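As a rough illustration of what an identity resolution ruleset does (match records from different sources, then merge them into one unified profile), here is a toy Python sketch using a single exact-match rule on normalized email with a last-write-wins merge; real rulesets support multiple match rules and richer reconciliation, and all record fields here are invented:

```python
from collections import defaultdict

# Records from two hypothetical source systems (illustrative data only).
records = [
    {"source": "crm", "Email": "Pat@Example.com", "FirstName": "Pat"},
    {"source": "commerce", "Email": "pat@example.com", "LastOrderTotal": 129.99},
    {"source": "crm", "Email": "sam@example.com", "FirstName": "Sam"},
]

def resolve(records):
    """One exact-match rule (case-insensitive email) plus last-write-wins
    merging -- a toy stand-in for an identity resolution ruleset."""
    profiles = defaultdict(dict)
    for rec in records:
        key = rec["Email"].strip().lower()   # normalization step of the match rule
        merged = profiles[key]
        for field, value in rec.items():
            if field != "source":
                merged[field] = value        # last-write-wins reconciliation
        merged["Email"] = key
    return dict(profiles)

unified = resolve(records)
print(len(unified))
print(unified["pat@example.com"])
```

The two "pat" records collapse into one profile carrying fields from both sources, which is the behavior a ruleset packaged in a data kit would reproduce in the target org.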