
Integration

Integrate Analytics with Target

Last update: September 29, 2023
  • Topics:
  • Integrations

CREATED FOR:

  • Beginner
  • Leader
  • Developer
  • Admin

Integration value and setup

The videos below show the value of using this integration, as well as the details of getting it set up.

NOTE
These videos demonstrate the implementation and validation for Target at.js and Analytics AppMeasurement.js. Please refer to the documentation for the required library versions in both tools.

Setting up A4T (Analytics for Target)

In this video, meant for a developer, you learn how to:

  • Explain how Analytics and Target requests are tied together using the SDID (Supplemental Data ID)
  • Describe implementation requirements for Adobe Analytics with Adobe Target (A4T)

https://video.tv.adobe.com/v/35146/?quality=12&learn=on
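If you are implementing these libraries directly on the page (rather than through a tag manager), the sketch below illustrates, at a high level, how the SDID generated by the Experience Cloud ID (Visitor) service ends up on both the Target and the Analytics request. This is a minimal sketch: the Org ID and report suite ID are hypothetical placeholders, and the exact wiring depends on your deployment and library versions.

```javascript
// Minimal page-level sketch (hypothetical Org ID and report suite ID). In a typical
// deployment the load order is: VisitorAPI.js (ECID service), then at.js, then
// AppMeasurement.js. The ECID service generates a Supplemental Data ID (SDID) for
// the page view; A4T uses that SDID to stitch the Target and Analytics hits together.

// 1. ECID / Visitor API generates the SDID for this page view.
var visitor = Visitor.getInstance("EXAMPLE-ORG-ID@AdobeOrg");

// 2. at.js fires the Target request; with the Visitor API present, it automatically
//    attaches the SDID to that call.

// 3. AppMeasurement shares the same Visitor instance, so the Analytics page-view
//    hit carries the matching SDID.
var s = s_gi("examplereportsuite");  // s_gi() returns the tracker for a report suite
s.visitor = visitor;                 // share the Visitor instance (and its SDID)
s.t();                               // send the page-view hit; A4T joins it to the Target call
```

In a tag-manager deployment, the extensions handle this wiring for you; the key point is simply that both hits for a given page view carry the same SDID.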

Use Analytics as a Data Source for Target

In this video, meant for a business practitioner, you will learn:

  • What is A4T and why would I use it?
  • How does A4T work?
  • What are the prerequisites to using A4T?

https://video.tv.adobe.com/v/17384/?quality=12&learn=on

Transcript
Hi, I’m Kimen Warner, a group product manager for Adobe Target. In the next few minutes, I’m going to share how to start using Adobe Analytics as a reporting source for your optimization activities in Adobe Target. We call this A4T, or Analytics for Target. With this, you can start using all of your analytics data to drive the analysis of your optimization program. In this video we’ll cover what A4T is and why you would use it, how it works, and what the prerequisites are to get started. So here’s what it looks like. I’m looking at a report for an A/B test in Target. I can see my different experiences and their performance, both in a graph view up here and in a table view down below. This looks just like any other Target activities report, but the data shown here is displayed in real time from Adobe Analytics. This means that when I change the conversion metric here, maybe to average visit depth, then the data that I see here is coming directly from Analytics. I can also change the audience. And the options I get for the segments are all of the segments in my Analytics account. I can choose one of them here, and apply the results here. So I can choose any metric or segment without having to specify it for the test before I start to run it. This means that when someone asks you something like, “Hey, how did this test affect membership signups?” and you hadn’t thought of that before you ran the test, then no problem. You can just pick that as a metric here, apply it to the test, and see what the results would be. I can see the same report in Analytics too. I can click on this link to take me directly to the activity report in Analytics. From here, I can see lift and confidence, just like I can in Target. I can also change the metric to something else, let’s say page views, and see the results here, and the lift and confidence according to that metric. I can even apply a segment, like maybe visits from organic sources, or multiple segments, and then I can see the test results narrowed down by that segment. Now, one of the best things about this integration is that I can either look at segments that I didn’t think of before, or I can go in and add a new segment and apply the results retroactively to the test. Again, either in Analytics or in Target. I can also use the download capabilities, the sharing capabilities, and the dashboarding capabilities in Analytics. Having all this in Analytics also makes it easier for optimization teams to work with analysts at their company who are more comfortable in Analytics than in Target. So how does this work? Well, when you create an activity in Target, you get to choose if you want to use Analytics as your reporting source. You do that on the third step of activity creation, where you can choose in the Reporting settings to use Target or Analytics. I’ve chosen Analytics, which means I see my company name here, and I can also choose the appropriate report suite where I want all my test results to go. Finally, I get to choose my metric. I can choose a custom one on the page, like click tracking, or if they viewed an mbox, or viewed a page. 
I can also choose an existing Analytics metric from the entire list of Analytics metrics I already have in the system. You’ll also notice that I don’t specify any segments here. That’s because it’s done ad hoc when you view the reports, again, either in the Target reports or in the Analytics reports. The optimization data flows from Target to Analytics on every call to Target from your visitors through our servers directly, and not through the visitor’s webpage. This ensures that there’s no data variance between those two systems. Now, you probably want to dive in, so let me walk through the steps to get started. The first step is to be on the right version of Adobe Target and Adobe Analytics. That means you need to be on Target Classic, Target Standard, or Target Premium, and Analytics Standard or Premium. Step two is to make sure you’re on regional data collection. This is a capability of Analytics. Most of our customers are, but it’s definitely worth taking a look. Steps three and four are to make sure you’re on up-to-date libraries, and also to deploy the Marketing Cloud ID service. That means being on VisitorAPI.js version 1.4 or newer for the Marketing Cloud ID service. For Analytics, that means AppMeasurement v1.4 or newer, or s_code version 27.3 or newer. And for Target, it means mbox.js version 58 or newer, or at.js. The last step is to request access. If you go to this website, Adobe.com/go/audiences, and enter your information, we’ll get it set up for you. Once these steps are complete, you’re ready to go. That’s all you need to get started. Now you can start thinking about all the optimization opportunities when you can use all of your data from Analytics to power your activities.
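As a quick sanity check for the library-version prerequisites mentioned in the video, you can inspect the versions a page exposes from the browser console. This is a sketch only; the property names below are those exposed by recent builds of each library and should be treated as assumptions for your specific deployment.

```javascript
// Quick browser-console sanity check for the library versions listed above.
// The property names here are assumptions based on recent builds of each library;
// verify them against the documentation for the versions you have deployed.
console.log("ECID / Visitor API:", typeof Visitor !== "undefined" ? Visitor.version : "not loaded");            // needs 1.4+
console.log("AppMeasurement:", typeof s !== "undefined" && s.version ? s.version : "not loaded");               // needs 1.4+
console.log("Target at.js:", window.adobe && window.adobe.target ? window.adobe.target.VERSION : "not loaded");
console.log("Target mbox.js:", typeof mboxVersion !== "undefined" ? mboxVersion : "not loaded");                // v58+ if used
```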

Common use cases

The videos below show different features, activity types, and benefits of integrating via A4T.

Analytics for Target (A4T) Panel in Analysis Workspace

The Analytics for Target (A4T) panel lets you analyze your Adobe Target activities and experiences, with lift and confidence, in Analysis Workspace.

https://video.tv.adobe.com/v/37247/?quality=12&learn=on

Transcript
Hi, this is Jen Lasser with Adobe Analytics product management. In this video, I’m going to give you an overview of the Analytics for Target panel in Analysis Workspace. For those that aren’t familiar with Analytics for Target or A4T for short, it’s an integration between Adobe Target and Adobe Analytics. It enables Adobe Target to use Analytics data as its data source. And it enables Target users to analyze their Target activities and experiences, in Adobe Analytics. This unlocks deep Analytics capabilities, available through Analysis Workspace, such as freeform analysis, deep segmentation, journey visualizations, and much more. So, let’s take a look at what this Analytics for Target panel does for you. It’s available in the left rail under panels, and you simply drag it over to the middle. The first thing you want to do when you bring over the A4T panel, is select the activity you want to analyze. You can choose from a list of activities, and the list will be populated by the last six months of activities. Or you can drag over one from the left rail, just like you normally would. So, we’re going to select the AAC landing page activity. And notice, it happened really fast, but we retrieved a bit of information for you. We brought in all the experiences that relate to this activity, and we’ve selected one as the control. You can always change that through this drop down. We also updated the calendar date range, to reflect the active date range for this activity. This is passed over from Adobe Target. And, if you deactivate your activity in Target, there will be an end date applied here as well. You can select your normalizing metric. Visitors is typically the default, but Visits and Impressions are also an option. Normalizing metric is used as the denominator in the Lift calculation. The last thing you need to do is choose success metrics. And these can be any standard success metric, in Adobe Analytics. You can choose up to three success metrics and these will all have Lift and Confidence, calculated for them. You can search through the drop down, or you can type in what you’re looking for. So, I’m going to go ahead and add three different metrics here so you can see how this renders in the panel. Just like with the other drop downs in this building experience, you can also drag and drop over metrics from the left rail. The advantage of picking them from the drop down list, however, is that we’ve pre-populated the list with all of the acceptable or supported metrics, in this panel. So, we’ll go ahead and click Build. Like with many of our other panels, there is a progress bar up at the top, which will give you an indication of how long the panel is going to take to build. So, let’s take a look at what the panel built for us. First of all, at the top, there’s a summary line that shows you all the settings that you selected. Now, for each metric that you chose in the building state, we will output one table, and one trended conversion rate graph for you. Starting with the table, experiences will be down the rows, with the control experience in bold. In the columns, the first column will be the normalizing metric you selected. The second column will be the success metric you selected. And then, conversion rate is the division of those two. Lift and Confidence are now added to the table and they work off of the conversion rate column. Lift Mid is what you traditionally saw on Reports and Analytics. It’s a midpoint of the Lift range. 
Lift Lower and Upper are the lower and upper bounds of that range. If at any point you want to understand the definition of these metrics, you can always hover over the info icon here to get a better understanding of what they mean. Confidence is the final metric in this table, and it also has a definition. It’s based on a Student’s t-test model, just like it was in Reports and Analytics. We’ve also applied a static or custom conditional formatting range to the Confidence column, and it has thresholds of 75, 85, and 95%. So, the red, yellow, green coloring will change as it approaches those different thresholds. To analyze this table, you want to focus on the Control row. Notice that Lift and Confidence for the Control will be zeroed out, because that’s what all the variant experiences are compared against. And then, you want to look at the variants compared to that. What is their Lift, and the Confidence of that lift, compared to the control? Now, down below the table, we also have a conversion rate chart for the metrics you selected. Now, this is demo data, so it’s not producing that great of a chart. But you can see here that we have each experience, and we have its trended conversion rate. At any point, if you want to change the granularity of this chart, you can go ahead and click the gear, and then change this to any other granularity you’d like. The second and third metrics that you selected, when you built the panel, will be repeated in the same format further down. So, we selected activity impressions, which will have its own table and visualization. And then, our third metric was page URL instances, and we output a table and a visualization for that as well. Now at any point, if you want to make any edits to your panel, you can simply click the Edit pencil, change your inputs, and then rebuild. And, if you have any questions, you can click the question mark up here at the top and it will explain all the builder parameters, and link out to richer documentation if you need assistance. This has been an overview of the Analytics for Target panel in Analysis Workspace. We hope that both Target users and analysts alike get a lot of value out of being able to deeply analyze their Target activities and experiences, with Lift and Confidence, in Analysis Workspace.
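To make the table’s columns concrete, here is a small worked example (with made-up numbers) of the conversion rate and midpoint lift calculations described above; the panel’s actual confidence calculation uses a Student’s t-test and is not reproduced here.

```javascript
// Made-up numbers; shows how conversion rate and midpoint lift relate to the
// normalizing metric (visitors) and success metric (conversions) columns.
const control = { visitors: 10000, conversions: 400 };  // hypothetical control experience
const variant = { visitors: 10000, conversions: 460 };  // hypothetical variant experience

const conversionRate = ({ conversions, visitors }) => conversions / visitors;

// Lift (mid) compares the variant's conversion rate against the control's.
const liftMid = (conversionRate(variant) - conversionRate(control)) / conversionRate(control);

console.log(`Control CR: ${(conversionRate(control) * 100).toFixed(2)}%`);  // 4.00%
console.log(`Variant CR: ${(conversionRate(variant) * 100).toFixed(2)}%`);  // 4.60%
console.log(`Lift (mid): ${(liftMid * 100).toFixed(1)}%`);                  // 15.0%
```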

Analyze an Auto-Target Activity using the A4T Panel

In this video, you will learn how to use the Analytics for Target panel to visualize results of an Auto-Target test.

https://video.tv.adobe.com/v/333270/?quality=12&learn=on

Transcript

Hi, I’m Kati McKinney, an Expert Solutions Consultant for Adobe Analytics. In this video, I’m going to show you how to use the Analytics for Target integration to visualize results of an Auto-Target test. Analytics for Target, or A4T, is the integration between Adobe Analytics and Adobe Target, where you can use Analytics to analyze Target results. An Auto-Target test in Target leverages AI to serve the most tailored experience to each visitor based on individual customer profiles and the behavior of previous visitors with similar profiles. Due to this, measuring the results of an Auto-Target activity is different from measuring the results of a simple A/B test. So today, I’ll show you how you can easily measure results of this activity via Adobe Analytics. So, let’s get started. First, I’m going to drag the Analytics for Target panel into my workspace. I want to set the date range of my test, January 1st through the 13th in this case. And I have a few prompts here. First, Target Activity. I’m going to go ahead and choose the test that I want to measure. And we can see that when I did that, my Control Experiences updated. For Control Experience, you can choose any option, because we’re going to overwrite this later. This is because the Control Experience in an Auto-Target test is really setting a control strategy to serve random experiences. Next, we’re going to choose a Normalizing Metric. I’m going to choose Visits. For an Auto-Target test, always choose Visits as the normalizing metric for this type of experience. Auto-Target personalization selects an experience for a visitor once per visit, which means the experience can change upon every visit. If we were to use unique visitors, a single user could see multiple experiences, which would mean that the conversion rate was misleading. And then finally, we’ll choose a Success Metric of Activity Conversions. You should generally view reports with the same metric chosen for optimization when you set up your activity within Target. And we are going to go ahead and hit Build on this panel. We’ll see two visualizations once this builds, a Freeform table and a Line graph. We’re going to spend most of our time in this Freeform table. A couple of things to note: Lift and Confidence are not available for Control versus Targeted dimensions on Auto-Target activities. You can compute these manually by downloading the Confidence Calculator. Because of that, we’re going to go ahead and remove them from our table. The goal of a standard A/B Target test is to understand the Experiences versus Control. For an Auto-Target test, we need to compare the control strategy versus the targeted strategy. So, we are going to select Targeted versus Control and drag those in, replacing Target Experiences. Now, to gain further insight into how the model is performing, we want to break down Targeted versus Control by those experiences. So, we’ll go to the dimension of Targeted Experiences and select experiences A, B, and C. Drag them into Control to break down further, and we’ll do the same with Targeted. All right, we have a few more changes that we need to make. First, we need to create a filter for visits. Why do we need to create a filter here? Well, Adobe Analytics’ default counting methodology may include visits where a user did not interact with the Target activity. To ensure that our audience interacted with the activity, we need this filter. So, let’s go ahead and create a new segment. 
We will title it, and we can bring in Target Activities and make sure that it equals the activity that we’re measuring currently. And then finally, we want it to be an Instance Attribution model, and we will go ahead and save.

Now that we have that created, we can drag it under Visits, and we’ll only be seeing Visits that had that specific hit for a Target activity. Next, we need to align the Attribution model between the machine learning model and the goal metric. Target cannot wait on the default Attribution model within Analytics, which is 30 days, in order to train its models. So, we need to change the model on the goal metric to be same touch; otherwise, we would have discrepancies. So, we’re going to hit the gear icon and choose a non-default Attribution model.

And we’re going to use participation in this case with a look back window of Visit and we’ll go ahead and hit, Apply. Finally, we need to create a calculated metric on the conversion rate to leverage both the right attribution model and the filtered Visits metric. So, we are going to create a new metric and we’ll name it.

We want to have this as a percentage with two decimal places.

And now, we’ll drag in our Activity conversions. And again, we need to change the Attribution model just like we have in our Visualization. So, we’re going to choose Participation with a visit look back, and we’ll go ahead and apply that.

And now we need to add another container and we will divide it. And we’re going to bring in that filter we just created and Visits. So, we can ensure we’re only seeing visits with that particular hit.

And we will save this calculated metric. Now, we can drag this in and replace conversion rate.

So this conversion rate will take into account the Participation Attribution model for Activity Conversions, as well as the visits with the filter. So this is what your final workspace should look like after using the A4T panel for an Auto-Target test and making just a few tweaks. So, what I see here is that indeed, my Targeted experiences outperformed my Control experiences. Remember, the machine learning model was changing experiences to drive more conversions during the test. So, the Targeted experiences should always outperform Control. This is demo data, so it’s not as compelling as your data may be. To learn more about the important attributes in your tests, navigate to the Reports portion of Target and click on the Important Attributes icon. Thanks for watching.
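To illustrate the adjusted conversion rate that the calculated metric produces, here is a small worked example with made-up numbers: attributed Activity Conversions (Participation model, Visit lookback) divided by Visits filtered to the Target activity, compared across the Control and Targeted strategies.

```javascript
// Illustrative numbers only; this mirrors the calculated metric built in the video:
// adjusted CR = attributed Activity Conversions / Visits filtered to the Target activity.
const controlStrategy  = { filteredVisits: 5000,  attributedConversions: 150 };  // hypothetical
const targetedStrategy = { filteredVisits: 20000, attributedConversions: 780 };  // hypothetical

const adjustedRate = ({ attributedConversions, filteredVisits }) =>
  attributedConversions / filteredVisits;

console.log(`Control strategy CR:  ${(adjustedRate(controlStrategy) * 100).toFixed(2)}%`);  // 3.00%
console.log(`Targeted strategy CR: ${(adjustedRate(targetedStrategy) * 100).toFixed(2)}%`); // 3.90%
```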

We also have two step-by-step tutorials showing you the details of setting up A4T reports in Analysis Workspace for Auto-Allocate and Auto-Target activities:

Setting up A4T reports in Analysis Workspace for Auto-Allocate activities

An Auto-Allocate activity identifies a winner among two or more experiences and automatically reallocates more traffic to the winner while the test continues to run and learn. The Analytics for Target (A4T) integration for Auto-Allocate allows you to see your reporting data in Adobe Analytics, and you can even optimize for custom events or metrics defined in Analytics.

Set up A4T reports for Auto-Allocate activities

Setting up A4T reports in Analysis Workspace for Auto-Target activities

The Analytics for Target (A4T) integration for Auto-Target activities uses the Adobe Target ensemble machine learning (ML) algorithms to choose the best experience for each visitor based on their profile, behavior, and context, all while using an Adobe Analytics goal metric.

Although rich analysis capabilities are available in Adobe Analytics Analysis Workspace, a few modifications to the default Analytics for Target panel are required to correctly interpret Auto-Target activities, due to differences between experimentation activities (manual A/B Test and Auto-Allocate) and personalization activities (Auto-Target).

Set up A4T reports for Auto-Target activities
