Configure the Azure Blob Destination
Learn how to set up a connection and send data to the Azure Blob Storage destination in Real-Time Customer Data Platform. This destination supports exporting datasets and audiences, and allows you to customize the file headers and data attributes.
For more information, please visit the documentation.
Transcript
In this video, I'll show you how to configure a connection to the Microsoft Azure Blob destination and then send data using the Experience Platform user interface. Let's get started by establishing the connection first. I'm logged into Experience Platform and have selected Destinations under Connections. This opens the Catalog view. Next, I'll scroll to the Categories section and select Cloud Storage. Here is the destination card for Azure Blob Storage. Since I don't have any previous connections saved, I'll select Set up. This opens the workflow, beginning with configuring the connection and continuing through sending data to the storage account.

The first thing I need to do is enter the connection string. I'll quickly switch to a slide that shows you where to get this in the Azure portal once you have a storage account configured. When you open the storage account, select Access keys under Security + networking. Show the connection string and copy it. It starts with DefaultEndpointsProtocol, and make sure you copy the entire string. Now I'll paste this string into the text field in the workflow. Adding an encryption key to attach to your exported files is optional. I won't be doing this for my demonstration, but I highly recommend you do. You can install open-source tools that take care of creating public and private keys, but it's the public key value that's copied here. To finish the connection, I'll click Connect to destination. The connection is now confirmed. You'd receive an error message if it were unsuccessful.

Additional fields now appear on this configuration view. These are the storage account destination details. This destination supports sending datasets, prospects, and audiences from Experience Platform. There's a separate video about sending datasets if you're interested in learning about that. I'll be sending an audience for my demo, so I'll keep that selection. Then I'll fill in the name, description, folder path, and container fields. The last two fields specify the destination folder and container in the storage account. You can get these values from the Azure portal. For the file type, I have two options, JSON and Parquet. I'll choose JSON. Toward the bottom, I can choose a compression format, which I'll do now. The available choices depend on the file type selected above. If you're working with the production sandbox, you'll have the option to set up alerts. I'm finished with these inputs, so I'll select Next in the top right to move forward.

This step of Configure New Destination prompts me to select the marketing action appropriate for this connection. I'll choose Email Targeting from the list and then Next at the top. Because I chose audiences in the destination details step, I'm presented with the list of audiences in my Experience Platform sandbox. If you have a lot of audiences in your sandbox, you can use the search bar to filter the list. Since I'm using a development sandbox, I don't have many. I'll choose the Luma customers with level gold or above, then select Next to move forward.

On the Scheduling step, I can specify whether I'm exporting a new audience, or audiences, or those that have already been activated. Let's review the scheduling options. Here you can decide whether you want to export the full file as a one-time operation, or export incremental files that contain new data for people who become part of the audience over time. Next, I can choose a frequency setting and a start time.
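If you want to double-check the storage account values before saving the connection, a small script outside of Experience Platform can confirm that the connection string is complete and that the container is reachable. This is only an illustrative sketch: the azure-storage-blob Python package and the placeholder account, key, and container names below are assumptions, not part of the Experience Platform workflow.

```python
# Optional sanity check (run outside Experience Platform): verify the
# connection string and container before pasting them into the workflow.
# Requires the azure-storage-blob package; all names are placeholders.
from azure.storage.blob import BlobServiceClient

# An Azure storage connection string has this general shape:
conn_str = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=<your-storage-account>;"
    "AccountKey=<your-account-key>;"
    "EndpointSuffix=core.windows.net"
)

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("<your-container>")

# True means the container exists and the key grants access, so the same
# values should work in the destination's Container and Folder path fields.
print(container.exists())
```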
You can specify a precise time or send the data after Experience Platform finishes segment evaluation. That option ensures the most up-to-date audience profiles are exported after the daily batch segmentation job completes. I could customize the start date as well. You can also modify the file name shared with the storage account. The first item is a file name preview, and below that there's a lot of flexibility to append settings to the file name, choose a different date and time format to append, or add custom text. I'm going to keep the standard file name for my demo, so I'll cancel out of this step. Once I'm done with the scheduling settings, I'll go to the next step.

The Mapping step lets you customize the data going out. It presents you with recommendations that map the source field, which is the XDM source field from the schema, to the attribute fields. You can add a calculated field as well. I encourage you to explore all of the options available to you there. You can also add a new mapping field. If you want to send additional attributes beyond the recommended field mappings, you do that by opening this modal and choosing the fields. I'm going to keep most of the recommended fields, except the last one, which I'll remove by clicking the Delete Mapping icon next to it. Mandatory attributes ensure that exported profiles contain specific attributes, like email address, and that only profiles with those attributes are exported. Deduplication keys, on the other hand, help identify and handle duplicate records, letting you specify the fields that identify duplicates and decide how many duplicate records to keep. I won't make any further changes, so I'll select Next above.

This is the review step, where I can verify my settings. Once everything looks good, I'll select Finish in the upper right corner. That's it. I receive a confirmation that my destination has been successfully saved. After this, you would connect to the Azure Storage account to confirm that you see the files there. This concludes the demo for configuring and sending Experience Platform data to the Azure Blob destination. Thanks for watching!
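To close the loop on the Azure side, you can list the exported files under the folder path you configured and peek at one of them. Again, this is a minimal sketch under assumptions: it uses the azure-storage-blob package, placeholder connection, container, and folder-path values, and assumes the JSON file type without compression was selected in the destination setup.

```python
# Optional check after an export run: list files under the configured folder
# path and inspect the start of one exported JSON file. Placeholder values.
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="<your-connection-string>",
    container_name="<your-container>",
)

# List exported files under the folder path configured in the destination.
blobs = list(container.list_blobs(name_starts_with="<your-folder-path>/"))
for blob in blobs:
    print(blob.name, blob.size)

# Download the first export and print its beginning, just to confirm the
# mapped attributes (for example, the email field) made it into the file.
if blobs:
    data = container.download_blob(blobs[0].name).readall().decode("utf-8")
    print(data[:500])
```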