A/B Testing Strategies for Recommendations | Constructor

Written by Nate Roy | Apr 22, 2025 9:00:00 AM

Product recommendations can make the difference between a single-item purchase and a full cart, but with limited real estate across your homepage, product pages, and cart, every decision about how to design, display, and populate them matters. 

Let's explore how strategic A/B testing can help you validate your recommendation strategy decisions and drive better results.

Homepage Recommendations 

There are many creative ways to embed recommendation pods into homepages. Consider testing both merchandising themes and searchandising strategies within each section.

For example, a large American photo and video equipment retailer’s “Deal Zone” could be manually merchandised or dynamically populated based on rules or personalization. A/B testing can quantify which strategies drive more engagement and revenue.

A large American photo and video equipment retailer’s homepage ‘Deal Zone’ recommendation pod
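
To make the comparison concrete, here's a minimal sketch (in TypeScript, with an illustrative product shape, SKU list, and discount threshold, not Constructor's actual API) of the two population strategies such a test might pit against each other:

```typescript
// Sketch of two "Deal Zone" population strategies to compare in an A/B test.
// The DealItem shape, SKU list, and thresholds are illustrative placeholders.

interface DealItem {
  id: string;
  discountPercent: number;
  popularityScore: number; // e.g. trailing 7-day views or sales
}

// Variant A: a manually merchandised list, curated by the team.
const MANUAL_DEAL_ZONE_IDS = ["sku-101", "sku-205", "sku-318", "sku-412"];

function manualDealZone(catalog: DealItem[]): DealItem[] {
  return MANUAL_DEAL_ZONE_IDS
    .map((id) => catalog.find((item) => item.id === id))
    .filter((item): item is DealItem => item !== undefined);
}

// Variant B: dynamically populated by rules (deep discounts, ranked by popularity).
function dynamicDealZone(catalog: DealItem[], limit = 4): DealItem[] {
  return catalog
    .filter((item) => item.discountPercent >= 30)
    .sort((a, b) => b.popularityScore - a.popularityScore)
    .slice(0, limit);
}

// The A/B test simply routes each shopper's homepage request to one strategy or the other.
export function dealZoneFor(variant: "manual" | "dynamic", catalog: DealItem[]): DealItem[] {
  return variant === "manual" ? manualDealZone(catalog) : dynamicDealZone(catalog);
}
```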

Tabbed recommendation pods are trending on ecommerce homepages. For example, the luxury department store chain below lets shoppers browse through several themes like new arrivals under $100, work apparel, makeup, and new markdowns. Consider testing which tab appears first (the default visible tab) against the alternatives.

Luxury department store chain’s tabbed homepage recommendation pod

Product Page Recommendations 

Because product detail pages (PDPs) are typically your most complex templates, it’s important to test recommendation pod design and placement as well as your recommendation strategy.

Recommendation pods present a competing call to action to “add to cart”: you’re asking shoppers to weigh a slightly more complicated decision. Used effectively, though, they can boost conversion rate (CVR), revenue per visitor (RPV), and average order value (AOV).

Pod recipes

If you already present a combination of recommendation pod types on your PDPs (pod recipes), a great test to start with is their stack rank, or the order in which they are shown.

For example, Sephora uses “You May Also Like,” “Compare Similar Products,” and “Use it With” pods. Because not all users will scroll to see all three (especially on mobile), a simple experiment testing pod order (A/B or A/B/C) is a great place to start before testing rules within the pods.

‘Compare similar products’ first (control)    

 

 ‘You may also like’ first (proposed variant B)

 

‘Use it with’ first (proposed variant C)
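
As a rough sketch of how such a pod-order experiment could be wired up (in TypeScript, with illustrative pod names and a simple hash-based bucketing function rather than any specific experimentation SDK):

```typescript
// Minimal sketch of an A/B/C pod-order test on the PDP.
// Pod names and the bucketing approach are illustrative, not Constructor's SDK.

type PodId = "compare_similar" | "you_may_also_like" | "use_it_with";

const POD_ORDER_VARIANTS: Record<string, PodId[]> = {
  control:   ["compare_similar", "you_may_also_like", "use_it_with"],
  variant_b: ["you_may_also_like", "compare_similar", "use_it_with"],
  variant_c: ["use_it_with", "compare_similar", "you_may_also_like"],
};

// Deterministic bucketing: the same shopper always sees the same variant.
function bucketFor(userId: string, variants: string[]): string {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return variants[hash % variants.length];
}

export function podOrderFor(userId: string): PodId[] {
  const variant = bucketFor(userId, Object.keys(POD_ORDER_VARIANTS));
  return POD_ORDER_VARIANTS[variant];
}

// Example: render pods on the PDP in the assigned order.
// podOrderFor("shopper-123") -> e.g. ["you_may_also_like", "compare_similar", "use_it_with"]
```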

New thematic pods 

Beyond the standard “you might like” and “more like this” varieties, consider testing attribute-level recommendations, like “more in this color” (or pattern, fabric, scent, or finish).

A/B testing tells you how effective these pods are relative to your standard recommendations.


More in this [color]

More in this [fabric]

Or try a pod of deeply discounted recommendations (but remember, revenue and AOV may be lower compared to other pod strategies, even when click-to-conversion is high).

“Last call” recommendation pod
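
Here’s a minimal sketch of how these thematic pods might be populated, assuming a hypothetical Product shape and a 50%-off threshold for “last call” items (your own catalog model and recommendation service will differ):

```typescript
// Sketch of attribute-level and "last call" recommendation pods.
// The Product shape is a hypothetical stand-in for your own catalog model.

interface Product {
  id: string;
  attributes: Record<string, string>; // e.g. { color: "Navy", fabric: "Linen" }
  price: number;
  salePrice?: number;
}

// "More in this color" / "More in this fabric": same attribute value, different item.
export function moreInThisAttribute(
  anchor: Product,
  attribute: string,
  catalog: Product[],
  limit = 8
): Product[] {
  return catalog
    .filter((p) => p.id !== anchor.id)
    .filter((p) => p.attributes[attribute] === anchor.attributes[attribute])
    .slice(0, limit);
}

// "Last call": deeply discounted items, sorted by discount depth (deepest first).
export function lastCallPod(catalog: Product[], limit = 8): Product[] {
  return catalog
    .filter((p) => p.salePrice !== undefined && p.salePrice < p.price * 0.5)
    .sort((a, b) => a.salePrice! / a.price - b.salePrice! / b.price)
    .slice(0, limit);
}
```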

Alternative Product

Within Constructor, you can also serve alternative product recommendations to steer shoppers toward a featured item, upsell, or otherwise attractive option, such as an item from a brand the customer has frequently bought from or added to favorites.

For example, the premium consumer electronics retailer below suggests an upsell for the ‘next model in this lineup’ and includes a short summary of product specs inside the component.

How a premium consumer electronics retailer displays an alternative product upsell

 

🧪 At Constructor, internal A/B tests found that complementary recommendations led to significantly more users clicking a recommendation and then adding it to cart, compared to alternative items.

While more shoppers clicked on alternative items overall, complementary recommendations drove more conversions and larger basket sizes.

The alternative strategy (upselling) drove +8.1% more users clicking recommendations.

The complementary strategy drove +11.6% more users clicking a recommendation followed by an add-to-cart, and +13.6% more clicks followed by a purchase.
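
For context, per-user funnel lifts like these are typically computed as relative differences between the two strategies’ rates. The sketch below shows the arithmetic with placeholder counts; the metric names and numbers are illustrative, not Constructor’s internal data:

```typescript
// Relative lift between two strategies: (variant rate - baseline rate) / baseline rate.
// All counts below are placeholders, not Constructor's experiment data.

interface StrategyStats {
  users: number;                  // users exposed to the strategy
  clickedRec: number;             // users who clicked a recommendation
  clickedThenAddedToCart: number; // users who clicked and then added to cart
  clickedThenPurchased: number;   // users who clicked and then purchased
}

function relativeLift(variantRate: number, baselineRate: number): number {
  return (variantRate - baselineRate) / baselineRate;
}

export function compareStrategies(baseline: StrategyStats, variant: StrategyStats) {
  const rate = (numerator: number, stats: StrategyStats) => numerator / stats.users;
  return {
    clickLift: relativeLift(rate(variant.clickedRec, variant), rate(baseline.clickedRec, baseline)),
    clickToCartLift: relativeLift(
      rate(variant.clickedThenAddedToCart, variant),
      rate(baseline.clickedThenAddedToCart, baseline)
    ),
    clickToPurchaseLift: relativeLift(
      rate(variant.clickedThenPurchased, variant),
      rate(baseline.clickedThenPurchased, baseline)
    ),
  };
}

// Usage with made-up counts (a clickLift of 0.10 would read as "+10%"):
// compareStrategies(
//   { users: 10000, clickedRec: 800, clickedThenAddedToCart: 300, clickedThenPurchased: 100 },
//   { users: 10000, clickedRec: 880, clickedThenAddedToCart: 330, clickedThenPurchased: 115 }
// );
```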

 

Cart Cross-Sells 

As with all recommendation pods, your A/B testing use cases are virtually unlimited, but showing recommendations in the cart is a hot topic in itself. Many conversion optimization professionals argue that you should keep distractions out of the cart to maximize proceed-to-checkout actions.

Others argue that cross-selling is a great tactic to increase AOV and revenue, especially on mobile, where continuing the shopping journey through the navigation menu carries more friction.

It’s not just a matter of whether to show cross-sells, though. Results also depend on how your cart page (or mini-cart drawer) is designed, your merchandising strategy, and how cross-sells display once the cart fills up with items.

Global show-hide test 

The simplest experiment (and the one you should start with) is a show/hide A/B test. 

This enables you to gauge how receptive your shoppers are to cross-sells in general.

An example cart with cross-sells

 

An example cart without cross-sells
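
A rough sketch of how the split itself might work, assuming deterministic bucketing on a user ID and a placeholder analytics call (swap in whatever experimentation tooling you actually use):

```typescript
// Sketch of a show/hide cart cross-sell test. The bucketing mirrors the
// pod-order sketch above; trackExposure is a stand-in for your analytics call.

type CrossSellVariant = "show" | "hide";

function crossSellVariantFor(userId: string): CrossSellVariant {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "show" : "hide";
}

function trackExposure(userId: string, variant: CrossSellVariant): void {
  // Replace with your analytics/experimentation event.
  console.log(`experiment=cart_cross_sells user=${userId} variant=${variant}`);
}

// In the cart component: log the exposure, then render cross-sells only for "show".
export function shouldRenderCartCrossSells(userId: string): boolean {
  const variant = crossSellVariantFor(userId);
  trackExposure(userId, variant);
  return variant === "show";
}
```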

 

A losing test doesn’t mean you need to scrap the idea of cross-selling in your cart altogether. Consider re-testing with different merchandising strategies or UI patterns to see if you can close the gap.

Once you have a winning recipe, you can more confidently proceed with keeping cart cross-sells live, and experiment with more granular rules.

Recommendation orientation 

Testing a horizontal scroller against a vertical pattern can help you determine which layout has the greater impact on revenue.

A cart with horizontal cross-sells    

 

A cart with vertical cross-sells

 

Tabbed cross-sells

Merchandising the mini-cart is challenging with limited real estate on both mobile and desktop.

Tabbed recommendations enable you to pack more into the cart and give users some choice. You can also set up your experiment to switch the default visible tab from category A to B, for example.

A cart with tabbed cross-sell recommendations
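
A minimal sketch of the default-tab experiment, assuming illustrative tab names and the same style of hash-based assignment as the earlier sketches; the key design choice is that only the default tab changes, so the comparison stays clean:

```typescript
// Sketch of a default-tab experiment for tabbed cart cross-sells.
// Tab ids and labels are illustrative placeholders.

interface CrossSellTab {
  id: string;
  label: string;
}

const TABS: CrossSellTab[] = [
  { id: "accessories", label: "Complete the Look" },
  { id: "bestsellers", label: "Bestsellers" },
  { id: "sale", label: "On Sale" },
];

function assignVariant(userId: string, variants: string[]): string {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return variants[hash % variants.length];
}

// Only the default (first visible) tab changes between variants;
// the tab set itself stays constant so the comparison is clean.
export function defaultTabFor(userId: string): CrossSellTab {
  const defaultTabId = assignVariant(userId, ["accessories", "bestsellers"]);
  return TABS.find((tab) => tab.id === defaultTabId) ?? TABS[0];
}
```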

Add-to-Cart Interstitials

Modals give you more room to work with on both desktop and mobile. Consider testing them against traditional cart cross-sells.

Add-to-cart confirmation modal populated with accessories related to the product added

You can also A/B test designs against each other. For example, the men’s clothing brand below redesigned its add-to-cart modal; a redesign like this is a natural candidate for A/B testing before a full rollout.

Add-to-cart confirmation modal populated with complementary products

 

Add-to-cart confirmation modal with redesigned UI

Next Steps 

The power of product recommendations lies not just in having them but in using them strategically, informed by real user behavior. 

Through systematic A/B testing of placement, content, and presentation, you can build a recommendation strategy that drives meaningful results. Start with your highest-traffic areas, measure impact carefully, and keep optimizing based on what the data reveals. 

For more A/B testing insights and strategies from the Constructor experimentation team, check out the Experiments Blog.