Ingest the data

  1. In the Platform user interface, select Datasets in the left navigation.

  2. Open your Luma Loyalty Dataset.

  3. Scroll down until you see the Add Data section in the right column.

  4. Upload the luma-loyalty.json file.

  5. Once the file uploads, a row for the batch will appear.

  6. If you reload the page after a few minutes, you should see that the batch has uploaded successfully with 1000 records and 1000 profile fragments.

    Ingestion

NOTE
There are two options, Error diagnostics and Partial ingestion, that you will see on various screens in this lesson. These options aren’t covered in the tutorial. Some quick info:
  • Enabling error diagnostics generates data about the ingestion of your data, which you can then review using the Data Access API. Learn more about it in the documentation.
  • Partial ingestion allows you to ingest data containing errors, up to a threshold that you can specify. Learn more about it in the documentation.
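
If you later create batches through the Batch Ingestion API instead of the UI, both options correspond to fields on the batch-creation request. The sketch below is a minimal, non-authoritative Python example: the endpoint and headers are the standard Platform API set, but treat the enableErrorDiagnostics and partialIngestionPercent field names as assumptions to verify against the documentation.

```python
import requests

# Minimal sketch: create a batch with error diagnostics and partial ingestion
# enabled. The enableErrorDiagnostics and partialIngestionPercent field names
# are assumptions; confirm them against the Batch Ingestion API documentation.
def create_batch(access_token, api_key, org_id, sandbox, dataset_id):
    payload = {
        "datasetId": dataset_id,
        "inputFormat": {"format": "json"},
        "enableErrorDiagnostics": True,   # assumed field for error diagnostics
        "partialIngestionPercent": 5,     # assumed field: allow up to 5% bad records
    }
    response = requests.post(
        "https://platform.adobe.io/data/foundation/import/batches",
        headers={
            "Authorization": f"Bearer {access_token}",
            "x-api-key": api_key,
            "x-gw-ims-org-id": org_id,
            "x-sandbox-name": sandbox,
            "Content-Type": "application/json",
        },
        json=payload,
    )
    response.raise_for_status()
    return response.json()["id"]  # id of the newly created batch
```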

Validate the data

There are a few ways to confirm that the data was successfully ingested.

Validate in the Platform user interface

To confirm that the data was ingested into the dataset:

  1. On the same page where you ingested the data, select the Preview dataset button in the top-right.

  2. Select the Preview button and you should be able to see some of the ingested data.

    Preview the successful dataset

To confirm that the data landed in Profile, which may take a few minutes, follow these steps (an API-based lookup is also sketched after the list):

  1. Go to Profiles in the left navigation
  2. Select the icon next to the Select identity namespace field to open the modal
  3. Select your Luma Loyalty Id namespace
  4. Then enter one of the loyaltyId values from your dataset, for example 5625458
  5. Select View
    Confirm a profile from the dataset
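
As a hedged alternative to the UI lookup above, you can make the same check against the Profile Access API. In the sketch below, the /access/entities endpoint and headers are the standard Platform API set; the lumaLoyaltyId namespace symbol is a placeholder for whatever identity symbol you gave your Luma Loyalty Id namespace, and 5625458 is the same sample loyaltyId used above.

```python
import requests

# Minimal sketch of the same profile lookup via the Profile Access API.
# "lumaLoyaltyId" is a placeholder for the identity symbol of your
# Luma Loyalty Id namespace; the loyaltyId value comes from your dataset.
def lookup_profile(access_token, api_key, org_id, sandbox, loyalty_id):
    response = requests.get(
        "https://platform.adobe.io/data/core/ups/access/entities",
        headers={
            "Authorization": f"Bearer {access_token}",
            "x-api-key": api_key,
            "x-gw-ims-org-id": org_id,
            "x-sandbox-name": sandbox,
        },
        params={
            "schema.name": "_xdm.context.profile",  # look up profile entities
            "entityId": loyalty_id,                  # e.g. "5625458"
            "entityIdNS": "lumaLoyaltyId",           # placeholder namespace symbol
        },
    )
    response.raise_for_status()
    return response.json()

# Example usage:
# print(lookup_profile(token, key, org, "prod", "5625458"))
```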

Validate with data ingestion events

If you subscribed to data ingestion events in the previous lesson, check your unique webhook.site URL. You should see three requests arrive in the following order, with some time between them, each with one of the following eventCode values:

  1. ing_load_success: the batch was ingested
  2. ig_load_success: the batch was ingested into the identity graph
  3. ps_load_success: the batch was ingested into the profile service

Data ingestion webhook

See the documentation for more details on the notifications.
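
If you want to check the order programmatically rather than by eye, the small helper below pulls eventCode values out of saved notification payloads. It assumes only that each notification body is JSON with an eventCode field somewhere inside it, and that you have saved the request bodies from webhook.site into a local folder (the webhook-payloads folder name is just an example).

```python
import json
from pathlib import Path

def find_event_codes(obj):
    """Recursively collect any 'eventCode' values in a notification payload."""
    codes = []
    if isinstance(obj, dict):
        for key, value in obj.items():
            if key == "eventCode":
                codes.append(value)
            else:
                codes.extend(find_event_codes(value))
    elif isinstance(obj, list):
        for item in obj:
            codes.extend(find_event_codes(item))
    return codes

# Example: inspect notification bodies saved from webhook.site as *.json files.
for path in sorted(Path("webhook-payloads").glob("*.json")):
    payload = json.loads(path.read_text())
    print(path.name, find_event_codes(payload))
```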

Ingest data in batches with the Platform API

Now let’s upload data using the API.

NOTE
Data architects, feel free to upload the CRM data via the user interface method.

Download and prep the data

  1. You should have already downloaded and unzipped luma-data.zip into your Luma Tutorial Assets folder.
  2. Open luma-crm.json in a text editor and replace all instances of _techmarketingdemos with your own underscore-tenant id, as seen in your schemas (a scripted alternative is sketched after these steps).
  3. Save the updated file.
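
If you would rather script the replacement than edit the file by hand, the small sketch below does the same substitution. The _yourtenantid value is a placeholder; use the underscore-tenant id from your own schemas.

```python
from pathlib import Path

# Replace the sample tenant id with your own underscore-tenant id.
# "_yourtenantid" is a placeholder; copy the real value from your schemas.
path = Path("luma-crm.json")
text = path.read_text(encoding="utf-8")
path.write_text(text.replace("_techmarketingdemos", "_yourtenantid"), encoding="utf-8")
print("Replacements made:", text.count("_techmarketingdemos"))
```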

Get the dataset id

First, let’s get the id of the dataset into which we want to ingest data:

  1. Open Postman.
  2. If you don’t have an access token, open the request OAuth: Request Access Token and select Send to request a new access token, just like you did in the Postman lesson.
  3. Open your environment variables and make sure the value of CONTAINER_ID is still tenant.
  4. Open the request Catalog Service API > Datasets > Retrieve a list of datasets and select Send.
  5. You should get a 200 OK response.
  6. Copy the id of the Luma CRM Dataset from the response body (the same call outside Postman is sketched below).
    Get the dataset id
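
For reference, the Postman request above wraps a plain call to the Catalog Service API. The sketch below makes the same call with Python and filters the response by dataset name; the headers are the usual Platform API set, and matching on the name Luma CRM Dataset assumes you kept that dataset name.

```python
import requests

# Minimal sketch of "Retrieve a list of datasets" outside Postman.
# The dataset name "Luma CRM Dataset" is assumed to match what you created.
def find_dataset_id(access_token, api_key, org_id, sandbox, name="Luma CRM Dataset"):
    response = requests.get(
        "https://platform.adobe.io/data/foundation/catalog/dataSets",
        headers={
            "Authorization": f"Bearer {access_token}",
            "x-api-key": api_key,
            "x-gw-ims-org-id": org_id,
            "x-sandbox-name": sandbox,
        },
        params={"limit": 100},
    )
    response.raise_for_status()
    # The Catalog response is an object keyed by dataset id.
    for dataset_id, dataset in response.json().items():
        if dataset.get("name") == name:
            return dataset_id
    return None
```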