Understand Lift and Confidence reporting in Auto-Allocate activities
In Auto-Allocate activities, the first experience (by default named Experience A) is always defined as a “Control” experience on the Reports tab. This experience is not treated as a true statistical control in the modeling used to determine the performance of experiences, but it is treated as a reference or baseline for some figures in the report.
The “Lift” value and its 95% bounds for each experience are always calculated relative to the defined “Control” experience. Because the “Control” experience cannot have lift relative to itself, a blank “—” value is reported for it. Also, unlike in A/B tests, if an experience in an Auto-Allocate activity performs worse than the defined control, a negative Lift value is not reported; “—” is displayed instead.
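The lift arithmetic can be illustrated with a short sketch (the conversion rates below are hypothetical; Target computes these values from live activity data):

```python
def lift(experience_rate: float, control_rate: float) -> str:
    """Lift of an experience relative to the control, as reported:
    the control row and any experience at or below the control show "—"."""
    if control_rate <= 0 or experience_rate <= control_rate:
        return "—"  # control vs. itself, or worse than control: no negative lift
    return f"{(experience_rate - control_rate) / control_rate:.1%}"

# Hypothetical conversion rates
print(lift(0.055, 0.050))  # above control → "10.0%"
print(lift(0.045, 0.050))  # below control → "—"
print(lift(0.050, 0.050))  # control vs. itself → "—"
```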
The displayed Confidence Interval bars represent the 95% confidence interval around the mean estimate of an experience’s conversion rate. These bars are also color-coded with respect to the defined “Control” experience. The “Control” experience’s bar is always colored gray. The portions of confidence intervals below the “Control” experience’s confidence interval are colored red and the portions of confidence intervals above the “Control” experience are colored green.
A winner is found when the leading experience’s 95% Confidence Interval does not overlap the confidence interval of any other experience. The winning experience is designated with a green star badge to the left of the experience name and in the “Winner” banner. When no star is visible, the banner reads “No Winner Yet” and a winner has not yet been found.
A “Confidence” number is also reported next to the currently leading or winning experience. This figure is reported only after the leading experience’s Confidence reaches at least 60%. If two experiences are present in the Auto-Allocate activity, this number represents the confidence level that the experience is performing better than the other experience. If more than two experiences are present in the Auto-Allocate activity, this number represents the confidence level that the experience is performing better than the defined “Control” experience. If the “Control” experience is winning, no “Confidence” figure is reported.
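Adobe does not publish the exact model behind this Confidence figure. As an illustrative analogue only (not Target’s actual algorithm), a Bayesian “probability to be best” can be estimated by Monte Carlo sampling from Beta posteriors over each experience’s conversion rate:

```python
import random

def prob_to_be_best(results: dict[str, tuple[int, int]],
                    draws: int = 20_000, seed: int = 0) -> dict[str, float]:
    """results maps experience name -> (conversions, visitors).
    Estimates each experience's chance of having the highest true
    conversion rate, using Beta(1 + c, 1 + v - c) posteriors."""
    rng = random.Random(seed)
    wins = {name: 0 for name in results}
    for _ in range(draws):
        samples = {
            name: rng.betavariate(1 + c, 1 + v - c)
            for name, (c, v) in results.items()
        }
        wins[max(samples, key=samples.get)] += 1
    return {name: w / draws for name, w in wins.items()}

# Hypothetical counts: experience B converts better and dominates
print(prob_to_be_best({"A": (500, 10_000), "B": (560, 10_000)}))
```

Under this sketch, only the leading experience accumulates a high probability, which mirrors why all other experiences in the report show 0%.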
Frequently Asked Questions
Consider the following answers to frequently asked questions:
It has been a few days into the activity. Why are all confidence values still showing 0%?
Any of the following reasons can explain why 0% displays in the report’s Confidence column for all experiences:
- Manual A/B tests and Auto-Allocate use different statistics to display Confidence values.

  Manual A/B tests use p-values based on Welch’s t-test. A p-value is the probability of finding the observed (or a more extreme) difference between an experience and the control, given that in reality no such difference exists. These p-values can be used only to determine whether the observed data is consistent with an experience and the control being the same; they cannot be used to determine whether one experience differs from another (non-control) experience.

  Auto-Allocate instead shows the probability that a given experience is the true winner across all experiences in the activity. Only the winning experience (the one most likely to be the winner) has a non-zero confidence value; all other experiences are most likely losers and display 0%.
- Auto-Allocate starts showing confidence only after the winning experience reaches 60% confidence. These confidence levels typically appear in about half the time a comparable manual A/B test would take to complete (although this time frame is not guaranteed). To estimate how long such an A/B test would run, use the Adobe Target Sample Size Calculator: enter the control’s conversion rate for “Baseline conversion rate,” 5% for “Lift,” and 95% for “Confidence.” Confidence typically starts appearing after each experience has amassed at least 50% of the required samples per experience, which gives you an idea of when to expect it.
- If the report shows 0% across the board, it is likely still too early in the activity.
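For contrast, the manual A/B statistic described above can be sketched as a Welch’s t-test on per-visitor 0/1 conversion outcomes. To keep the sketch dependency-free, the p-value below uses a large-sample normal approximation to the t distribution (an assumption that is reasonable at typical web-traffic volumes):

```python
import math

def welch_p_value(c1: int, n1: int, c2: int, n2: int) -> float:
    """Two-sided Welch's t-test for a difference in conversion rates,
    treating each visitor as a 0/1 outcome. Uses a large-sample normal
    approximation to the t distribution for the p-value."""
    p1, p2 = c1 / n1, c2 / n2
    var1 = p1 * (1 - p1) * n1 / (n1 - 1)  # sample variance of 0/1 outcomes
    var2 = p2 * (1 - p2) * n2 / (n2 - 1)
    t = (p1 - p2) / math.sqrt(var1 / n1 + var2 / n2)
    return math.erfc(abs(t) / math.sqrt(2))  # two-sided p-value

# Hypothetical counts: control (5.0%) vs. one experience (5.6%)
print(welch_p_value(500, 10_000, 560, 10_000))
```

Note that this statistic only compares one experience against the control, which is why it says nothing about how two non-control experiences compare to each other.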
Are the “No Winner,” “Winner,” and “star” badges available for Auto-Allocate activities that use Analytics as the reporting source (A4T)?
The “No Winner Yet” and “Winner” badges are currently not available in the A4T panel in Analysis Workspace. These badges are also not available if the same report is viewed in Target. A winner “star” badge shown in a Target report for an Auto-Allocate activity using A4T should be ignored.
For more information about this and other limitations and notes, see the Auto-Allocate section in A4T support for Auto-Allocate and Auto-Target activities.