## Issue 1: Why can a customer belong to both a variant and the control group in an A/B test? Why do I see a "Mixed" group in my A/B test customer segments?

When a new customer is first shown a campaign that is being A/B tested, they are assigned either to a variant or to the control group, and they keep that assignment on every later visit. However, the same person may return on a different device or in an incognito window. In that case, they appear as a new customer with a new cookie and can therefore be assigned to a different group. Once that customer logs in (for example, during checkout), Bloomreach Engagement identifies them and only then knows that the two profiles belong to the same person, so it automatically merges them into one. If the merged profiles were assigned to different groups, the customer must be excluded from any A/B test evaluation. This is why the evaluations contain a "Mixed" segment, which collects these merged customers.
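
The mechanics can be pictured with a short sketch. This is not Bloomreach Engagement's internal code: the `assignGroup` hash, the profile shape, and the `mergeProfiles` helper are illustrative assumptions that only mirror the behavior described above.

```typescript
// Minimal sketch: deterministic group assignment keyed by cookie ID,
// and what happens when two anonymous profiles merge after login.
// All names here are illustrative, not Bloomreach Engagement internals.

type Group = "Variant A" | "Control";

interface Profile {
  cookieId: string;
  group: Group;
}

// Stable hash: the same cookie always lands in the same group.
function assignGroup(cookieId: string): Group {
  let hash = 0;
  for (const ch of cookieId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "Variant A" : "Control";
}

// When login reveals that two cookies belong to one person, the profiles
// merge. If their groups disagree, the merged customer is evaluated as
// "Mixed" and excluded from the A/B test evaluation.
function mergeProfiles(a: Profile, b: Profile): Group | "Mixed" {
  return a.group === b.group ? a.group : "Mixed";
}

// Same person, two devices -> two cookies -> possibly two groups.
const laptop: Profile = { cookieId: "cookie-laptop-01", group: assignGroup("cookie-laptop-01") };
const phone: Profile = { cookieId: "cookie-phone-77", group: assignGroup("cookie-phone-77") };

console.log(laptop.group, phone.group, "->", mergeProfiles(laptop, phone));
```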

## Issue 2: Why do I see fewer customers with the event `banner.action = "show"` than customers with `ab test` events for the same variant?

The event `banner.action = "show"` is tracked only once the weblayer is fully displayed. If the page is reloaded before the weblayer appears, no banner event is tracked. The `ab test` event, in contrast, is tracked as soon as a customer is first assigned to a variant, regardless of what is actually displayed on the page. For this reason, you can sometimes see fewer customer profiles with at least one banner show for a variant than customer profiles with an `ab test` event for the same variant. The longer it takes for the weblayer to display after the trigger action, the higher the chance that the user reloads the page before the banner show event is tracked. We therefore recommend keeping an eye on the weblayer variant setting `Show After` and on the total time it takes to render your weblayer.
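
The timing difference can be illustrated with a small sketch. The `track` function and the exact call signatures are stand-ins, not the real SDK API; the sketch only assumes the behavior described above (assignment tracked immediately, banner show tracked after the display delay).

```typescript
// Minimal sketch of the tracking order described above. track() is a
// hypothetical stand-in for the real tracking call.

function track(event: string, properties: Record<string, string>): void {
  console.log(`tracked ${event}`, properties);
}

function runWeblayer(variant: string, showAfterMs: number): () => void {
  // 1) The `ab test` event fires immediately, at variant assignment
  //    time, regardless of whether the weblayer ever becomes visible.
  track("ab test", { variant });

  // 2) The banner show event fires only once the weblayer is fully
  //    displayed, i.e. after the `Show After` delay has elapsed.
  const timer = setTimeout(() => {
    track("banner", { action: "show", variant });
  }, showAfterMs);

  // If the page unloads (reload, navigation) before the timer fires,
  // the banner show event is never tracked -- hence fewer "show"
  // customers than `ab test` customers for the same variant.
  return () => clearTimeout(timer);
}

// Simulate a user who reloads 1s after the trigger while `Show After`
// is 3s: the `ab test` event is tracked, the banner show is not.
const cancelOnReload = runWeblayer("Variant A", 3000);
setTimeout(cancelOnReload, 1000);
```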

## Issue 3: Why do I see a different scenario A/B test ratio in the node stats than in the node configuration?

An A/B test variant is selected only once per customer profile. For the scenario A/B test node, this means the variant is assigned the first time the customer reaches the node. However, a customer profile can pass through the scenario multiple times, for example when you use a repeated scenario trigger, and on each subsequent pass the customer's existing variant is reused. Node stats report every pass through the node, not unique customers. As a result, the variant ratio shown in the A/B test node stats can differ from the ratio configured on the node. Most often, the root cause is scenario logic that, for some reason, sends customers from one variant back into the A/B test node disproportionately more often than customers from the other variants.
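
A small sketch with made-up data shows how the two ways of counting diverge; the `Pass` shape, the customer IDs, and the numbers are purely illustrative assumptions.

```typescript
// Minimal sketch of why node stats can diverge from the configured
// 50/50 split: stats count passes through the node, while the variant
// is assigned once per unique customer.

interface Pass {
  customerId: string;
  variant: "A" | "B";
}

// Configured split: 2 customers in A, 2 in B (50/50 by customer).
// But the scenario logic re-routes variant A customers through the
// node more often, so A dominates the pass-through counts.
const passes: Pass[] = [
  { customerId: "c1", variant: "A" },
  { customerId: "c1", variant: "A" }, // repeated trigger, variant reused
  { customerId: "c1", variant: "A" },
  { customerId: "c2", variant: "A" },
  { customerId: "c3", variant: "B" },
  { customerId: "c4", variant: "B" },
];

function countBy(keys: string[]): Record<string, number> {
  return keys.reduce<Record<string, number>>((acc, k) => {
    acc[k] = (acc[k] ?? 0) + 1;
    return acc;
  }, {});
}

// What the node stats show: one count per pass.
console.log("by pass:", countBy(passes.map((p) => p.variant)));
// -> { A: 4, B: 2 }  (looks like a 67/33 ratio)

// What the configuration describes: one count per unique customer.
const variantPerCustomer = [
  ...new Map(passes.map((p): [string, string] => [p.customerId, p.variant])).values(),
];
console.log("by customer:", countBy(variantPerCustomer));
// -> { A: 2, B: 2 }  (the configured 50/50)
```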
