How can we calculate confidence for auto-allocate activities manually?

Description

The confidence of an Auto-Allocate activity shown in the Target UI differs from the value calculated with the Confidence Calculator Excel sheet. Why is that?

Resolution

Auto-Allocate uses a different confidence calculator, based on Bernstein bounds, than regular A/B tests (which is where the calculator you used applies; it uses a t-test, i.e. p-values derived from the t-distribution).
The confidence calculations used by Auto-Allocate are much more conservative than those of a regular t-test. This is to control for false positives, i.e. to prevent us from declaring too early that one arm is better than another when the effect we're seeing is actually due to randomness.
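To illustrate the difference, here is a sketch contrasting a classical t-test-style confidence with an empirical Bernstein-bound interval. Note this is illustrative only: Adobe does not publish Auto-Allocate's exact formula, so the Bernstein half-width below (the Maurer-Pontil form) and the example numbers are assumptions, not Target's actual computation.

```python
# Illustrative only: contrasts a normal-approximation t-test confidence
# with an empirical Bernstein bound to show why bound-based methods are
# more conservative. Not Adobe Target's actual Auto-Allocate formula.
import math

def t_test_confidence(conv_a, n_a, conv_b, n_b):
    """Approximate two-sided confidence (1 - p-value) for a difference
    in conversion rates, using a normal approximation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = abs(p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return 1 - p_value

def bernstein_half_width(p, n, delta=0.05):
    """Empirical Bernstein half-width for the mean of n Bernoulli
    samples; the true mean lies inside with probability >= 1 - delta."""
    var = p * (1 - p)
    return (math.sqrt(2 * var * math.log(3 / delta) / n)
            + 3 * math.log(3 / delta) / n)

# Hypothetical example: 10,000 visitors per arm, 5.0% vs 5.6% conversion.
conf = t_test_confidence(500, 10_000, 560, 10_000)   # roughly 94%
half = bernstein_half_width(0.056, 10_000)            # roughly 0.0078
# The Bernstein half-width (~0.78 pp) exceeds the observed 0.6 pp lift,
# so a bound-based method would not yet separate the arms even though
# the t-test reports high confidence.
print(conf, half)
```

The bound-based interval stays wider than the normal-theory one at the same sample size, which is exactly why the confidence shown for an Auto-Allocate activity lags the number the Excel calculator produces.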

So that is why a classical t-test-based confidence can be 93% while the Auto-Allocate confidence is much lower. This is also covered in our documentation: [Automated traffic allocation](https://experienceleague.adobe.com/docs/target/using/activities/auto-allocate/automated-traffic-allocation.html?lang=en#section_98388996F0584E15BF3A99C57EEB7629)

Also, a 60% confidence level is required for Auto-Allocate activities, as mentioned here: [Determine the winner of an Auto-Allocate activity](https://experienceleague.adobe.com/docs/target/using/activities/auto-allocate/determine-winner.html?lang=en#section_C8E068512A93458D8C006760B1C0B6A2)
