iSoftStone provides Adobe Target services to organizations of all sizes and all levels of experience. We’ve partnered with established Target users who want dev support, organizations interested in adding Target to Adobe Experience Manager, and teams that are entirely new to personalization and A/B testing.
No matter where our clients are in their journey, we’ve found five Adobe Target best practices that hold true. Each will help you get more from your A/B testing and personalization programs as you work to delight customers and boost conversions.
1. DO: Take advantage of the low barrier to entry.
Adobe Target is one of the most user-friendly solutions on the market. The Visual Experience Composer (VEC) makes it relatively easy for non-technical team members to make straightforward design and copy modifications and to begin leveraging the power of automated personalization. Because Adobe Target tests don’t require page re-authoring or code deployment, they can be swiftly launched and stopped. Target also integrates smoothly with Adobe Analytics, giving your marketing team access to rich insights and reporting.
Think of Target as your safe space to experiment. Get creative with hypotheses and use the data to start conversations across the business.
2. DON’T: Make too many changes at once or test all-up redesigns.
Making sweeping changes and going all-in on personalization can be tempting, especially in the context of an inspiring and innovative leadership vision, but incremental changes with specific, measurable goals will take you further. An example: we heard from a client whose team completely redesigned a store page for a best-selling product. They worked hard to deliver a beautiful, radically new design in line with a senior executive’s ideas. However, it didn’t perform well in the A/B test, and our client couldn’t explain why. With so many changes in play, it was simply impossible to pinpoint what had worked and what hadn’t.
Keeping tests focused and personalization limited will isolate variables, enable you to track changes in performance, and deliver clean, actionable data.
3. DO: Use the Auto-Allocate feature.
Standard A/B tests split traffic 50-50 and keep that distribution fixed. In practice, that means an underperforming variation can expose your business to lower engagement or lost revenue, however briefly, while the test runs. Adobe Target’s Auto-Allocate feature can increase conversions from the very start of your test: it monitors each experience’s performance against the goal metric and intelligently shifts site traffic, gradually and proportionately moving visitors toward the winning experience.
If the purpose of your test is to identify top performers, Auto-Allocate is an ideal solution to optimize activity and mitigate risk.
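To make the mechanics concrete, the TypeScript sketch below simulates the general idea of adaptive allocation using a simple epsilon-greedy rule and made-up conversion rates. It is purely illustrative; Adobe Target’s actual Auto-Allocate algorithm is more sophisticated and isn’t reproduced here.

```typescript
// Illustrative only -- not Adobe Target's actual Auto-Allocate algorithm.
// A simple epsilon-greedy allocator: most traffic goes to the variant with
// the best observed conversion rate, while a small share keeps exploring.

interface Variant {
  name: string;
  trueRate: number;    // hypothetical underlying conversion rate (unknown to the allocator)
  visitors: number;
  conversions: number;
}

const variants: Variant[] = [
  { name: "control",    trueRate: 0.030, visitors: 0, conversions: 0 },
  { name: "challenger", trueRate: 0.036, visitors: 0, conversions: 0 },
];

const EXPLORE_SHARE = 0.2; // fraction of traffic still split evenly so data keeps accumulating

function observedRate(v: Variant): number {
  return v.visitors === 0 ? 0 : v.conversions / v.visitors;
}

function allocate(): Variant {
  if (Math.random() < EXPLORE_SHARE) {
    // Exploration: pick a variant at random.
    return variants[Math.floor(Math.random() * variants.length)];
  }
  // Exploitation: send the visitor to the current best performer.
  return variants.reduce((best, cur) =>
    observedRate(cur) > observedRate(best) ? cur : best
  );
}

// Simulate 10,000 visitors flowing through the activity.
for (let i = 0; i < 10_000; i++) {
  const v = allocate();
  v.visitors++;
  if (Math.random() < v.trueRate) v.conversions++;
}

for (const v of variants) {
  console.log(
    `${v.name}: ${v.visitors} visitors, ` +
    `${(100 * observedRate(v)).toFixed(2)}% observed conversion rate`
  );
}
```

Even in this toy version, the challenger ends up receiving most of the traffic once its advantage shows up in the data, which is exactly the risk-mitigation benefit described above.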
4. DON’T: Go it alone.
Yes, as we noted, Target is a truly user-friendly solution, but the best A/B testing and personalization is still a cross-functional team effort. You don’t have to do it all yourself. For example, bring in someone familiar with data analytics who can dive into the details and interpret the nuances. You should also consider partnering with a vendor that has robust Adobe Experience Cloud expertise. Front-end developers can write custom HTML, CSS, and JavaScript and deliver more technically sophisticated modifications than are typically achievable through the Visual Experience Composer.
If you’re looking to boost overall performance, create a true digital destination, and take your customer experiences to the next level, a developer will help you get there.
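As a hypothetical example of what a developer-built modification can look like beyond point-and-click edits, the TypeScript sketch below restyles a hero call-to-action and surfaces clicks as a custom DOM event that an analytics tag could pick up. The selector (.hero-cta), class name, and event name are placeholders we’ve invented for illustration; in a real activity, logic like this would typically be added as plain JavaScript in the custom code options.

```typescript
// Hypothetical custom modification: restyle the hero call-to-action and
// record clicks so the analytics team can tie engagement back to the test.
// All selectors, class names, and event names below are placeholders.

function applyHeroCtaVariant(): void {
  const cta = document.querySelector<HTMLAnchorElement>(".hero-cta");
  if (!cta) return; // page without a hero -- leave it untouched

  cta.textContent = "Start your free trial";
  cta.classList.add("hero-cta--high-contrast");
  cta.setAttribute("data-test-variant", "cta-copy-b");

  cta.addEventListener("click", () => {
    // Surface the interaction as a DOM event for downstream tracking.
    document.dispatchEvent(
      new CustomEvent("hero-cta-test-click", { detail: { variant: "cta-copy-b" } })
    );
  });
}

// Apply the change once the document is ready, even if the script loads late.
if (document.readyState === "loading") {
  document.addEventListener("DOMContentLoaded", applyHeroCtaVariant);
} else {
  applyHeroCtaVariant();
}
```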
5. DO: Implement the highest performing variations across your site.
We’ve seen firsthand how clients benefit from taking full advantage of test results and learnings from Adobe Target. We recently supported a SKU chooser module redesign that performed so well in testing that our client’s sister teams reached out to ask whether they could use it on their own pages. A sitewide change came about as an organic benefit and now drives additional revenue for our client every day. The company subsequently changed its business processes to allow for cross-team agreement on test designs and visibility into the results.
We recommend this joined-up approach for securing buy-in and making insights actionable. After all, the end goal of testing is to build the most compelling experience for your users and customers. Once you’ve proven out a hypothesis, you’ll want to scale its impact.