This might be due to the definitions or analyses being in different initiatives. When a piece of your use case is placed in an initiative, all other pieces either need to be in the same initiative, or they need to be marked as "Global Object".
- Why do I have the "mixed" group in my customer AB test segments?
Every time a new customer is shown a campaign that is being A/B tested, they are assigned either to a variant or to the control group. If they come back, they are always assigned to the same group. However, the customer may return using a different device or an incognito window. In that case, they are seen as a new customer with a new cookie and can therefore be assigned to a different group. Once that customer logs in (for example, during checkout), Bloomreach Engagement identifies them, and only then does the system know it is actually the same person. Bloomreach Engagement automatically merges these seemingly different customers into one. If they were assigned to different groups in those two sessions, we want to exclude them from any A/B test evaluation. Therefore, we create the "Mixed" segment in our evaluations, where these merged customers fall.
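The assignment and merge behavior can be sketched in Python. This is a minimal illustration, not the actual Bloomreach Engagement implementation; the hash-based assignment and all function names are assumptions made for the example.

```python
import hashlib

def assign_group(cookie_id: str, groups: list[str]) -> str:
    # Deterministic: the same cookie is always assigned to the same group.
    digest = int(hashlib.sha256(cookie_id.encode()).hexdigest(), 16)
    return groups[digest % len(groups)]

def evaluation_group(assigned_groups: set[str]) -> str:
    # After a profile merge, one customer may carry assignments from several
    # cookies; if they disagree, the customer falls into the "Mixed" segment.
    return next(iter(assigned_groups)) if len(assigned_groups) == 1 else "Mixed"

# Same person, two devices -> two cookies, possibly two different groups.
groups = ["Control", "Variant A"]
phone = assign_group("cookie-phone-123", groups)
laptop = assign_group("cookie-laptop-456", groups)
merged = evaluation_group({phone, laptop})  # "Mixed" if the two disagree
```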
Why do I see fewer customers with event banner.action = "show" than customers with ab test events for the same variant?
The event banner.action = "show" is usually tracked only once the web layer is fully displayed. If the page is reloaded before the web layer is displayed, no banner event is tracked.
The ab test event, on the other hand, is tracked when the customer is first assigned to a variant, regardless of what is actually displayed on the web page.
For this reason, you can sometimes see fewer customer profiles with at least one banner show event for a variant than customer profiles with an ab test event of the same variant.
The longer it takes for the web layer to display after the trigger action, the higher the chance that the user reloads the page and the banner show event is therefore not tracked.
For this reason, we recommend keeping an eye on the web layer's "Show After" setting and on the total time it takes to render your web layer.
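The difference between the two counts can be sketched in Python, with a hypothetical event log (customer names and the log shape are made up for the example):

```python
def count_profiles(sessions):
    """sessions: (customer_id, variant, web_layer_fully_displayed) tuples."""
    ab_test = set()       # tracked as soon as a variant is assigned
    banner_show = set()   # tracked only once the web layer fully displays
    for customer, variant, displayed in sessions:
        ab_test.add((customer, variant))
        if displayed:     # a reload before display drops this event
            banner_show.add((customer, variant))
    return ab_test, banner_show

sessions = [
    ("anna", "A", True),
    ("ben", "A", False),   # reloaded before the web layer rendered
]
ab_test, banner_show = count_profiles(sessions)
# fewer banner show profiles than ab test profiles for variant A
```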
An A/B test variant is selected only once for each customer profile. In the case of the scenario A/B test node, this means that the variant is assigned only when the customer reaches the node for the first time.
However, a customer profile can pass through a scenario multiple times (e.g., when you use a repeated scenario trigger). On each repeated pass through the node, the customer's previously assigned A/B test variant is reused. Campaign node stats report customers passing through the node. For this reason, you can see a completely different variant ratio in the A/B test node stats than in the node's A/B test configuration. See the example picture below.
Most often the root cause is the scenario logic, which for some reason disproportionately passes customers of one variant into the A/B test node more often than customers of the other variants.
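The effect can be reproduced with a short, hypothetical Python sketch: variants are fixed per profile, while node stats count every pass-through.

```python
def node_variant_stats(pass_log, variant_of):
    """pass_log: one entry per pass through the A/B test node.
    variant_of: the variant each profile got on its first arrival."""
    stats = {}
    for customer in pass_log:
        variant = variant_of[customer]   # reused on every repeat pass
        stats[variant] = stats.get(variant, 0) + 1
    return stats

# A 50/50 configuration assigned one profile to each variant, but a
# repeated trigger sends "anna" through the node nine times.
variant_of = {"anna": "A", "ben": "B"}
stats = node_variant_stats(["anna"] * 9 + ["ben"], variant_of)
# stats == {"A": 9, "B": 1} -- a 90/10 ratio in the node stats
```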
There can be several reasons for this:
- If you use AB testing, you might have been assigned to the control group, which is usually set to show the original. You can check whether this is the case by going to Customers and checking what events have been tracked to your profile. Alternatively, you can download our Chrome extension (still in Beta stage) and see what events are being tracked live in the console.
If this is the case, then either change the AB test settings or try opening the page in incognito mode until you get assigned to a variant (incognito mode gives you a new cookie every time you open it).
- In Settings:
- Check the condition "Show on" and make sure it matches the page that you are loading.
- If you use Show on page containing/specific URL, make sure there are no spaces in/before/after the text.
- Check "Category": If the campaign is set to show only to customers with specific consent, your profile must have that consent too.
- If you specified any conditions in the "Audience" part, you must match them.
- If the "Display" option in Settings is set to "Once" or "Until interaction", then you will either need to change it or use the incognito mode to see the campaign again.
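The whitespace pitfall in the "Show on" condition above can be demonstrated with a tiny substring match. This is a hypothetical illustration; the real matching logic may differ.

```python
def matches_page_containing(current_url: str, configured_value: str) -> bool:
    # A plain substring match: an accidental space before or after the
    # configured text silently prevents the campaign from showing.
    return configured_value in current_url

url = "https://shop.example.com/checkout"
assert matches_page_containing(url, "checkout")
assert not matches_page_containing(url, "checkout ")      # trailing space
assert matches_page_containing(url, "checkout ".strip())  # fixed
```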
In general, this error is usually returned for API requests that send an incorrectly formatted request body (usually JSON).
Common root causes might be:
- Valid JSON format, but invalid JSON content (missing mandatory fields and so on)
- Invalid JSON format.
The first root cause requires careful reading of specific API endpoint documentation, especially about mandatory fields.
The second root cause might return the following general HTTP 400 error:
BadRequest: The browser (or proxy) sent a request that this server could not understand
In such a case, verify whether your request body is being automatically compressed with gzip.
If it is, disable gzip compression for your API requests.
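The two root causes can be told apart before sending the request, for example with this Python sketch (the "customer_ids" mandatory field here is an assumption made for illustration; check the specific API endpoint documentation for the real field list):

```python
import json

def diagnose_body(raw_body: str, mandatory_fields: set[str]) -> str:
    """Classify a request body against the two HTTP 400 root causes above."""
    try:
        payload = json.loads(raw_body)
    except json.JSONDecodeError:
        return "invalid JSON format"   # the server may answer with the generic 400
    if not isinstance(payload, dict):
        return "invalid JSON content: expected a JSON object"
    missing = mandatory_fields - payload.keys()
    if missing:
        return f"invalid JSON content: missing {sorted(missing)}"
    return "ok"

# "customer_ids" is a made-up mandatory field for illustration.
assert diagnose_body('{"customer_ids": {}}', {"customer_ids"}) == "ok"
assert diagnose_body('{"other": 1}', {"customer_ids"}).startswith("invalid JSON content")
assert diagnose_body('{not json', {"customer_ids"}) == "invalid JSON format"
```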