4. Our journey started more than a decade ago…

Early Adoption (1996–2007): regional implementation · regional KPIs · internal testing capability · in-house clickstream tool · regional team · limited expertise and talent bench – work closely with vendors & consultants

Standardization (2008–2009): global implementation · standard KPIs – global dashboard · full funnel testing · advanced segmentation · global team by BU · grow in-house expertise

Maturity (2010–): 1:1 BT experience · industry-leading testing capability · VOC solution · global team across BU
5. Dell Online Today

Online: Dell.com + Premier + Global Portal + B2B + Channel Portal + eSupport + Community

- 1B visits per year to Dell.com
- 13B page views in FY'09
- An online order is placed every 2 seconds
- 725K B2B transactions per year
- 190K Premier Pages
- 170M visits to community properties
- 58K channel partners online
- 34 supported languages
- Googled 1M times per day
6. Scale & Complexity Bring Opportunity

Global Online Team spanning Public Services, SMB, Large Enterprise, CSMB, and Consumer
7. Centralized Analytics Operations

Global Online Team: tool strategy/support · governance · implementation · BU/IT partnering · vendor management · training/adoption

Global Online Platform: a dynamic, scalable & reliable eCommerce platform
14. Dell.com Online Analytics, Testing & Targeting

To be the technology e-commerce leader by offering the most relevant experience and solutions to drive high customer loyalty

To become an innovation engine through insightful analytics & the best-in-class testing capability to enable constant and rapid changes
15. People

- Double the size of the team
- On-boarding process to provide support and create connections
- On-going training & development plan

"If I were running a company today, I would have one priority above all others: to acquire as many of the best people as I could." – Jim Collins
17. Accountability

- 2x revenue upside Y/Y
- 400–600 tests globally
- Monetization model based on test results with statistical significance
- Complete transparency & alignment with finance & business

"Because they lack accountability, they fail to achieve credibility, and they have no authenticity." – Jim Collins
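The monetization model above leans on statistical significance. As one hedged sketch of how a conversion-rate winner might be called (the visitor counts and conversion numbers are illustrative, not Dell's actual methodology), a pooled two-proportion z-test is a common choice:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors on the control recipe;
    conv_b / n_b: the same for the test recipe.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: 2.0% vs 2.3% conversion on 50k visitors per branch.
z = two_proportion_z(1000, 50_000, 1150, 50_000)
# |z| > 1.96 would be significant at the 95% level (two-sided).
```

With these made-up numbers z comes out around 3.3, so the lift would clear the usual 95% bar; smaller samples with the same rates would not.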
18. Process

Brainstorm → recipes → implement → communicate → institutionalize learning, managed through a pipeline and roadmap

Analytics Manager lead · Test Manager lead · sign-off required · documentation required
21. Understanding Customer Intent

- Specific product offers for customers desiring an efficient shopping experience
- Improved site navigation for customers who want to take more time to find the right product
23. Testing Globally… an Iterative Process

Checkout testing – July, August, September: CA Cons · US Cons · UK Cons · US Cons · DE Cons · US SMB · MX Cons · CN Cons · US Cons · US Cons · US SMB · JP Cons · AU Cons · CN Cons · CN SMB · JP SMB · AU SMB · MX SMB · BR Cons · CA SMB · US Cons · US Cons · AU Cons · FR Cons · US Cons · MX SMB · BR SMB · US Cons · AU SMB
26. Embracing a New Culture: Testing as a "Religion"

- Testing baked into dell.com projects from the beginning
- Tests run rigorously with a clearly defined process
- Results captured and shared widely, good or bad
27. "It took Einstein ten years of groping through the fog to get the theory of special relativity, and he was a bright guy." – Jim Collins
28. WAA Austin Symposium – The Art & Science of Personalization, January 24th
29. Join the winning team!

- I am hiring a testing manager for the Greater China region
- There are a number of analytics/strategic positions open as well
- Send your resumes to ed_wu@dell.com
Speaker notes
We have one of the largest websites in the world. Last year, we had 1 billion visits from half a billion visitors, who viewed 8 billion pages.

We have complexity: Dell has an online presence in over 100 countries and is translated into 34 languages, with 4 global business units, thousands of system configurations, and thousands of online stores.
Focus on how the centralized global online platform, implementation, and vendor management allow BUs to collaborate more productively.
Talk about having had different calculations in different BUs for the same metrics. We used the SiteCatalyst implementation to standardize. We dealt with resistance during implementation – we would not implement with different definitions for different BUs.
Customized personal experience: not only personalization and behavioral-targeting-driven transactions, but customer-choice focused; intent & context based; multi-dimensional; cross-platform.

Customized social experience: not only social media integration and ratings & reviews, but embedded in the broader experience; contextual; cross-platform.

Customized solution experience: not only rich product content and bundles, but configurable solutions; a global solutions development platform; cloud.
When we formed our global team and got better at sharing information with each other, we started to learn things that have made us much better at getting the most out of our testing program. I'd like to take a few minutes to share some of the most important lessons we've learned.

A while back, we weren't very organized or focused; we just tested a lot of pages. We figured the more pages we tested, the more winners we'd find, and the faster we'd improve our site. But we noticed that our most impactful tests throughout the world tended to be on the same pages – just a few for each country. We began to realize that we would be better off testing a few pages multiple times throughout the quarter than testing as many pages as possible.

Another strategic focus area for us is basket and checkout pages. Even though these pages get a tenth of the volume of our most visited pages, they have the most revenue going through them. That shouldn't be surprising, because every purchase must go through these pages. The other reason they are valuable is that this is where we have our highest-quality customers. This is especially true for Dell, because we have a more detailed shopping experience than most stores: many of our customers spend a lot of time choosing components and accessories to build exactly the system they want. Customers who are willing to do this are seriously considering purchasing. Some of our greatest recent successes have been in cart and checkout tests.

Our last focus area is behavioral targeting. The reason this is important to us is that we have a very high success rate with targeting – in fact, most of the targeting campaigns we try are successful. Whenever we focus on a customer's needs and show them relevant content, they have a better experience and are more likely to buy.
It's not enough just to know where on your site to test and which pages to focus on. You need to know what you should try – what you should change and how to change it.

A common mistake by marketers is to think too much about what we want our customers to do; we'll design a site around the way we want them to shop. It's a good idea to turn that around and think about why your customer is on your site and exactly what they want to do. That's an easy concept, but in reality it is very complicated, because we all have so many different kinds of customers. A page that is designed perfectly for one customer might not meet the needs of another.

Here's one example from our site:
- Customers who are brand loyal and come to our site with the intention of purchasing expect a simple shopping experience. They don't want a sales pitch; they just want to see the product offers.
- Other visitors are open to buying from Dell or one of our competitors. They want to take their time making a decision and want reasons why Dell is better than our competitors. They would prefer a site with clear navigation that makes it easy to find the information they are looking for.

If we tested these two options against each other, one would likely come out on top. Suppose the option on the left came out with a 3% advantage in revenue per visitor. That would be good, but it doesn't tell the whole story. If we dig deeper, we find that the option on the left works 10% better for efficient shoppers but 7% worse for our cautious shoppers. The best solution is to use a targeting campaign to provide the best experience to both groups.

Target cautious shoppers based on:
- Clicks on Help Me Choose and other learning content
- Views of customer reviews
- Videos
- Other content that suggests a high level of engagement with the site

A great way to learn what works best for different types of customers is to do some segmentation analysis on your test results to see if the results vary for different kinds of customers. Sometimes a simple A/B test can result in a very successful targeting campaign. At the very least, you'll learn more about your various customers, which will help you design better tests in the future.
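The segmentation analysis described above can be sketched in a few lines. The segment names and revenue-per-visitor figures below are hypothetical, chosen to mirror the 10%-better / 7%-worse example:

```python
# Hypothetical segment-level results from a single A/B test
# (revenue per visitor, control vs. test recipe). Overall the test recipe
# may win, but slicing by segment shows it loses for one group.
results = {
    "efficient_shoppers": {"control": 10.00, "test": 11.00},
    "cautious_shoppers":  {"control": 12.00, "test": 11.16},
}

def lift(segment):
    """Relative revenue-per-visitor lift of the test recipe for one segment."""
    c = results[segment]["control"]
    t = results[segment]["test"]
    return (t - c) / c

per_segment = {seg: lift(seg) for seg in results}
# efficient shoppers up ~10%, cautious shoppers down ~7%:
# a diverging pair like this is a candidate for a targeting campaign.
```

The point of the sketch is the slicing, not the arithmetic: any time the per-segment lifts diverge in sign, a single winner-take-all rollout leaves money on the table.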
I wanted to show you a really simple example of targeting we do to show our customers more relevant content based on what we know about them.

This is our homepage in the United Kingdom. When a new customer comes to this page, they see a rotational banner with products across all of our segments, including consumer and business products. However, once a customer browses one of our customer segment sites, when they return to the country homepage they will only see banners related to that segment. For example, if a visitor views pages in the Consumer site and then returns, they will see banners featuring our consumer brands. Our Small and Medium Business customers will see products relevant to them.

We've done similar things with banners inside our customer segment sites – for example, targeting specific product families based on prior browsing behavior. You should look to target based on any criteria you find that seems to differentiate the way customers shop: clues from prior browsing history, traffic source, first-time versus repeat visitors, or prior purchases.

The bottom line is, if we know what content would be relevant to a certain type of customer, we should use targeting campaigns to make them feel like the site is designed around their needs.
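A minimal sketch of this kind of segment-based banner targeting, assuming a made-up URL scheme (the path prefixes and segment names are illustrative, not Dell's actual implementation):

```python
def banner_segment(browsing_history):
    """Pick the homepage banner set from the most recent segment site visited.

    browsing_history: list of page paths, oldest first. The "/consumer/" and
    "/smb/" prefixes are hypothetical placeholders for segment-site pages.
    """
    for page in reversed(browsing_history):  # most recent visit wins
        if page.startswith("/consumer/"):
            return "consumer"
        if page.startswith("/smb/"):
            return "smb"
    # New or unclassified visitor: show the full all-segments rotational banner.
    return "all-segments"
```

Real targeting tools express this as campaign rules rather than code, but the shape is the same: a default experience for unknown visitors, overridden by the strongest recent signal.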
One of the most significant changes to our test process is what we do with successful tests. Previously, if we had a successful test, we would implement the winner, have a short celebration, and then move on to some other idea. We now realize that we were missing out on a huge opportunity to drive more value out of every test we run. There are now two additional things we do every time we have a successful test.

First, we take what we learned from a test and attempt to design a new test that will perform even better. Even with a test that has positive lift, further analysis can reveal things about the design that might not have worked as we intended. So we'll redesign the test and run it again to see if we can do even better. A good example was a series of tests related to adding a trustmark to our checkout pages. Our early tests revealed that we had an idea that worked, but we continued to test different trustmarks and different locations until we found the solution that worked best. The final solution had more than double the lift we saw in our first successful trustmark test.

The second thing we do with our winners is run similar tests in most or all of our key markets throughout the world. Once we found the best trustmark solution in the US, we tested it in our other major markets.

In essence, instead of running a bunch of stand-alone tests, we think of our testing program as an iterative process for providing a global solution. From a single test, we generate many new test ideas and designs, and we take our best ideas and test them throughout the world.
Not only are we trying to get more out of our successful tests, but we now try to learn as much as possible from our mistakes as well. This happened in our checkout testing program when we tested the use of a simplified cart. We learned not to change too much at once. This slide illustrates the concept of the test.

The hypothesis of this test was sound: we believed that customers would prefer a simplified cart with as little text as possible. So we designed a test that removed or simplified any content we felt was unnecessary, then ran an A/B test. We were confident this test would work, so we were surprised to find the new recipe performing worse than the original. It appeared that we had removed some content our customers thought was helpful. But we had changed so much – how could we know which changes were positive and which were negative?

Fortunately, in this case we got lucky. The change that caused the significant negative performance involved navigation, so we were able to detect it through detailed pathing analysis of our test results. The change that hurt us was one of the smallest changes we made, at least visually: we moved the Continue Shopping link from the top of the page, next to the Continue button, to the very bottom of the page. This caused a frustrating shopping experience for customers who wanted to add other products to their cart, or go back to do more research before finalizing their order.

We learned a couple of good lessons from this test. First, if you are going to make a change that could significantly impact the way a customer browses your site – like changes to links to other pages – it is best to test it independently; moving this link didn't really fit the original hypothesis of this test. Second, if you want to understand the impact of changing many different elements on a page, set up the test so you can understand the impact of each change independently: either a multivariate test or an A/B/N test.
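The A/B/N idea above can be sketched as follows. The recipe names and conversion rates are hypothetical, but they show how one change per recipe keeps each effect attributable:

```python
# Hypothetical A/B/N checkout results: conversion rate per recipe, where each
# recipe changes exactly ONE element vs. control. Had all three changes been
# bundled into a single recipe, the net effect would have hidden the culprit.
recipes = {
    "control":             0.0210,
    "shorter_copy":        0.0214,
    "moved_continue_link": 0.0196,  # the isolated navigation change
    "removed_banners":     0.0211,
}

baseline = recipes["control"]
effects = {name: (rate - baseline) / baseline
           for name, rate in recipes.items() if name != "control"}

# The harmful change falls straight out of the per-recipe comparison,
# with no pathing forensics required.
worst = min(effects, key=effects.get)
```

A full multivariate test goes further by estimating interactions between changes, at the cost of needing more traffic per cell; the A/B/N layout is the cheaper option when main effects are all you need.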
In closing, I’d like to say that even if you have the right resources in place, the smartest people, and great processes, getting the most out of your testing program takes time. It will take a lot of failures and successes for you to learn the things you need to know to optimize your site. So be patient and stay committed to it, and it will definitely pay off.