Google Analytics and Website Optimizer
Let your visitors lead you
By Simon Whatley
http://www.google.com/analytics
http://www.google.com/websiteoptimizer
Exclude all traffic from a domain: use this filter to exclude traffic from a specific domain, such as an ISP or company network.
Exclude all traffic from an IP address: use this filter to exclude clicks from certain sources. You can enter a single IP address or a range of addresses.
Include only traffic to a subdirectory: use this filter if you want a profile to report only on a particular subdirectory (such as www.example.com/motorcycles).
Exclude Pattern: excludes log file lines (hits) that match the Filter Pattern. Matching lines are ignored in their entirety; for example, a filter that excludes Netscape will also exclude all other information in that log line, such as visitor, path, referral and domain information.
Include Pattern: includes log file lines (hits) that match the Filter Pattern. All non-matching hits are ignored, and any data in non-matching hits is unavailable to the Urchin reports.
Search & Replace: a simple filter that searches for a pattern within a field and replaces the found pattern with an alternate form.
Lookup Table: lets you select a lookup table name, which may be used to map codes to labels you understand.
Advanced: builds a field from one or two other fields. The filtering engine applies the expressions in the two Extract fields to the specified fields and then constructs a field using the Constructor expression. Read the Advanced Filters article for more information.
Uppercase / Lowercase: converts the contents of the field into all uppercase or all lowercase characters. These filters only affect letters; numbers and other characters are left unchanged.
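The all-or-nothing behaviour of Exclude and Include Pattern filters can be sketched in a few lines. The log format and field layout below are invented for illustration; real Urchin/Analytics filters operate on parsed hit fields, not raw text.

```python
# Sketch of Exclude/Include Pattern filter behaviour on log lines.
# A line is kept or dropped in its entirety, as described above.
import re

def apply_filters(lines, exclude=None, include=None):
    """Drop whole lines matching `exclude`; if `include` is given,
    keep only lines matching it."""
    kept = []
    for line in lines:
        if exclude and re.search(exclude, line):
            continue  # matching line ignored in its entirety
        if include and not re.search(include, line):
            continue  # non-matching hit unavailable to reports
        kept.append(line)
    return kept

log = [
    "203.0.113.5 /motorcycles/ Mozilla",
    "198.51.100.7 /cars/ Netscape",
    "203.0.113.9 /motorcycles/helmets Mozilla",
]
# Exclude every Netscape hit, then include only /motorcycles hits:
hits = apply_filters(log, exclude=r"Netscape", include=r"/motorcycles")
```

Note that excluding the Netscape hit also discards its visitor, path and referral data, exactly as the Exclude Pattern description warns.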
Maximum of 4 Goals per profile, but you can have an unlimited number of Profiles.
A 'funnel' is a series of pages through which a visitor must pass before reaching the goal conversion. The name comes from a graph of visitors who reach each page: the first page counts the most visitors, and each successive page shows fewer visitors as they drop off before reaching the final goal. The purpose of tracking these pages is to see how efficiently your pages direct visitors to your goal. If any of the funnel pages are overly complicated, or not designed to be user-friendly, you will see significant drop-off and lower conversion rates. You can track drop-off rates on pages leading to a goal using the Funnel Visualization report in the Goals section.

You can make the first step in the funnel mandatory by selecting the 'Required step' checkbox next to the funnel. If this checkbox is selected, visitors who reach your goal page without traveling through this funnel page will not be counted as conversions; only visitors who reach the goal URL after viewing this first page will be. For example, if you are tracking user flow through your checkout pages, do not include a product page as a step in your funnel. Note that the 'Required step' checkbox only affects the Funnel Visualization report; it does not stop a goal from showing a conversion in any other goal report. http://analytics.blogspot.com/2008_04_01_analytics_archive.html
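The numbers behind a Funnel Visualization report reduce to step-by-step drop-off percentages. The page names and visitor counts below are invented for illustration:

```python
# Each funnel step counts fewer visitors than the last; the
# step-to-step loss shows where the funnel leaks.
funnel = [
    ("/cart",     1000),
    ("/shipping",  620),
    ("/payment",   410),
    ("/confirm",   300),   # goal page
]

for (page, count), (_, nxt) in zip(funnel, funnel[1:]):
    drop = 1 - nxt / count
    print(f"{page}: {count} visitors, {drop:.0%} drop off before the next step")

# Overall funnel conversion: goal-page visitors / first-step visitors
conversion_rate = funnel[-1][1] / funnel[0][1]
```

In this made-up example 38% of visitors abandon between the cart and shipping pages, so that is the step to examine first.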
http://www.grokdotcom.com/2007/04/03/64-tips-for-getting-started-with-google-optimizer/

Calls to Action
1. Is your call to action distinctive? Does it stand out on the page?
2. Button language and/or hyperlink anchor text
3. B2B concerns: are you asking leads to 'submit'?
4. Button colours and shapes

Point of Action Assurances
5. Privacy policy: is it easy to understand?
6. Return policy and 3rd-party security verification: are they prominent?

Fonts
7. Choice of font
8. Readability: how easy are your fonts to read?
9. To bold or not to bold

Headlines
10. Use fractions or percentages to prove your claim.
11. Ask questions in the headline.
12. Use words that evoke emotion.
13. Different types of formatting: bolding, fonts, colours, capitalisation, sizes
14. Number of words used in the headline
15. Use exclamation points.
16. Use text to convey the benefits rather than the features of your products or services.
17. Self-focused (we/I) versus customer-focused (you) text
18. Use quotations in the headline (consider the length of the headline).
19. Reading level of the headline

Product Copy
20. Is the copy more than just a blurb?
21. How enticing is the copy?

Product Images
22. Present different image sizes.
23. Pay attention to image quality; it affects the perceived value.
24. Does the image show the product in a different context? (If you are selling tents, are you presenting the package or an image of the constructed tent?)
25. Product zoom and/or different angles
26. How enticing is the image?

Product Reviews
27. Do you use product reviews?
28. Do you provide more than just a simple score? Try a more multidimensional scoring system (i.e. quality, support and setup scores).
29. Are there positive and negative reviews? Transparency is important.

Category Pages
30. Are you accounting for customers early in the buying process who don't know the terms associated with the product?
31. Are you making it easy for those who know approximately what they want?
32. How quickly can those who know exactly what they want find their product? Ask the big three questions.

About Us Pages
33. Do you tell the story of your company?
34. Are you doing something more than just posting a mission statement? Take advantage of the About Us page by letting visitors see the personality and the people behind your company.

Contact Us Pages
35. Build more effective 'Contact Us' pages. Make it easy for visitors to find the information they seek.
36. Whom should they contact?
37. How soon can they expect a response?

Forms
38. How intimidating are your forms? Too many drop-downs and long text boxes make forms seem intimidating.
39. Remove unnecessary drop-downs.
40. Remove reset buttons.

Shopping Cart Tips
41. Check how many steps are in your checkout process.
42. Include a "Progress Indicator" on each checkout page.
43. Provide a link back to the product.
44. Add pictures inside the basket.
45. Provide shipping costs as early in the process as possible.
46. Show stock availability on the product page.
47. Make it obvious what to click next.
48. Make it easy for the shopper to edit her shopping cart.
49. Make it your fault if visitors input something incorrectly.
50. Make shoppers aware that you're a real entity; give them contact info.
51. Give the visitor the option to call.
52. Make it always about your new customer: do not force a new registration; instead make it part of the checkout.
53. Add 3rd-party reinforcement messages.
54. Present coupon codes carefully.
55. Deal with pricing issues head-on; include a price match guarantee.
56. GTC ("Get the Cash"): offer numerous payment options.
57. Offer point-of-action reassurance; make anything the visitor may be concerned about available and visible during checkout.
58. Save it for them. Save their cart in case they want to come back later.
59. When all else fails, survey. If they didn't complete the checkout, ask them why.
Focus on the pages that tend to lose the most customers. These include landing pages and your homepage: the entry points to your conversion funnels. Examine the search terms that direct visitors to these pages, and consider the needs of your personas to determine why visitors might leave; focus especially on copy. Analytics software identifies the pages with high bounce rates (high exit rates) and low time spent on page. This information will direct you to the problem pages on your website and landing pages.
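Flagging problem pages from those two signals is a simple filter. The page data and thresholds below are invented for illustration; real numbers would come from your Analytics content reports.

```python
# Sketch: flag likely problem pages by combining a high bounce rate
# with low average time on page. Thresholds are arbitrary examples.
pages = [
    # (url, entrances, bounces, avg_seconds_on_page)
    ("/landing-a", 5000, 4100, 9),
    ("/",          8000, 3500, 40),
    ("/landing-b", 3000, 1200, 75),
]

problem_pages = [
    url for url, entrances, bounces, secs in pages
    if bounces / entrances > 0.6 and secs < 15
]
```

Here /landing-a bounces 82% of its entrances after under 15 seconds on average, so its copy would be the first candidate for testing.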
Hypotheses in Website Optimizer are largely open-ended, so in fact all you can really say is that a variation may or may not be better than the original or another variation in the same test. But it does no harm to state what you believe to be true, such as "red buttons convert better than blue buttons", so your aim is clear. You could go further still:

Round, large, red buttons convert better than other combinations of button shape, size and colour.
Button colour matters more for increasing conversion than button shape or size.

Without precise language, you risk being imprecise in your experiments.
Google's A/B test: compare multiple page variations (i.e. entirely different pages). The entire page is treated as a variation/section; you test pages, not variables.
Google's Multivariate test: compare content variations in multiple locations on one page, a.k.a. the classic A/B variable-and-variations test.
Split-path test: split traffic among different linear paths, where each path contains multiple pages. You test the performance of grouped pages against other grouped pages. The test is essentially the same as an A/B test: each group of pages is treated as a section, and within that section you have one variation. Each variation has the same goal page.
Multipath Multivariate test: compare different sections on multiple pages at the same time, all within the same experiment. A good strategy is to run such tests after a winning combination has been identified from a split-path test.
Do Anything test: specify more than one goal, e.g. subscribe to a newsletter, fill out a lead-generation form, download a white paper, and/or buy a product.
Linger test: content websites don't have an obvious conversion goal, so it can be valuable to test how long a visitor remains on a page.
Click test: target a specific event or click, e.g. a download link or playback button.
http://www.grokdotcom.com/2007/11/02/google-website-optimizer-7-powerful-tests/
A metric will usually be a Key Performance Indicator (KPI) such as conversion rate or average order size. Variations should differ in only one respect, so that any difference in the KPI can be attributed to that change. So if you're testing button size or colour, don't change the page copy or images! You can test many pages: an A/B/C/D/E test is still simply an A/B test, in which each variation is tested against the original, 'A'.
https://www.google.com/analytics/siteopt/siteopt/help/calculator.html The factors used in this calculator are:

Test combinations: determined by the number of page sections and variations you've selected. To calculate, multiply together the number of variations for each section. For example, if there are three variations for the first section and two for the second, the number of combinations is 3 x 2 = 6. Keep in mind that this number can add up fast: if you designate six sections and want to test three variations for each, the number of combinations is 3 x 3 x 3 x 3 x 3 x 3 = 729! The time it takes to run your experiment is proportional to the number of combinations; fewer combinations mean a shorter experiment.

Page views: the average amount of traffic your test page gets per day. All other factors being equal, experiments on pages with higher traffic will achieve results faster.

% Visitors: the percentage of visitors you want to participate in the experiment. A value of "50" would include only half of all visits; in effect, it halves the amount of traffic. The lower you set this value, the longer an experiment will take.

Conversion rate: set this to the conversion rate the page has experienced in the past with the original content. The lower the conversion rate, the longer your experiment will take; sites that convert at 2% will finish in roughly half the time of sites that convert at 1%.

Expected improvement: how much better you expect at least one of the new combinations to perform over the original content, expressed as a percentage. If the current conversion rate is 15% and you hope to double that to 30%, then you're looking for an expected improvement of 100%. If you don't have a figure in mind, keep the default of 10%.
Improvement is the single most dominant factor determining duration. A good lift is usually the result of a high-performing variation: if you can think of a significantly better headline than the original, for example, your experiment will reveal a winner and finish more quickly. Factors:

Number of combinations (more combinations to test = more traffic required = longer duration)
Conversion rate (higher conversion = less traffic required = shorter duration)
Website traffic (higher traffic = shorter duration)
Estimated conversion rate lift (greater improvement = faster confirmation of statistical significance = shorter duration)
Percentage of your traffic's participation (lower participation = less traffic exposed to the test = longer duration)
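How these factors interact can be sketched numerically. Google does not publish the calculator's exact formula; the function below uses a standard sample-size approximation (roughly 95% confidence, 80% power) purely to illustrate which way each factor pushes the duration.

```python
# Rough, illustrative experiment-duration estimate -- NOT Google's
# actual calculation. Each parameter maps to a calculator factor.
def estimate_days(sections_variations, pageviews_per_day,
                  pct_visitors, conversion_rate, expected_improvement):
    combinations = 1
    for v in sections_variations:          # e.g. [3, 2] -> 3 x 2 = 6
        combinations *= v
    p = conversion_rate
    delta = p * expected_improvement       # absolute lift to detect
    # Classic two-proportion approximation: n ~ 16 * p(1-p) / delta^2
    n_per_combination = 16 * p * (1 - p) / delta ** 2
    daily_visitors = pageviews_per_day * pct_visitors
    return combinations * n_per_combination / daily_visitors

# 6 combinations, 1000 views/day, 50% participation, 2% baseline
# conversion, hoping for a 10% relative improvement:
days = estimate_days([3, 2], 1000, 0.5, 0.02, 0.10)
```

Plugging in numbers confirms the rules of thumb above: doubling the conversion rate roughly halves the duration, and halving participation doubles it.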
Tests are limited to 1000 combinations. So if you have 3 sections with 10 variations each, you have reached the limit: 10 x 10 x 10 = 1000 combinations.
There is only minor variation in the setup procedure for Multivariate tests, which, bizarrely, involves Google AdWords! Once the tests are set up, the system will validate that they are ready to go live. You'll also be able to preview the test pages before launch: a good way to discover whether the scripts have malfunctioned!
Chance to Beat Original: the probability that, as the experiment progresses, the given combination's mean conversion rate will beat that of the original. Here a combination's performance is measured only against the original, ignoring all other combinations. Combinations where this probability is high are good candidates to replace the original.

Observed Improvement: how much better one combination has performed over the original, given the data collected so far, expressed as a percentage. For example, if the original combination had a mean conversion rate of 10% and combination A showed a doubling in conversion rate, to a mean of 20%, then combination A performed 100% better than the original. Improvement can be negative if, for example, combination A experiences a lower conversion rate than the original. Note: when adopting any combination, actual performance will naturally vary somewhat from the observed performance during the experiment; conversion rate lift is particularly volatile given the number of variables involved.

Conversions / Visitors: the number of visitors who reached the conversion page after viewing the test page where the given combination was presented.

http://www.alkemi.co.nz/images/Alkemi/google_optimiser.png

An 80% confidence level is pretty low by statistics standards; normally, to guarantee a result, you would need a 95% confidence level. Chance to Beat Original uses a 95% confidence level. The confidence level describes the uncertainty associated with a sampling method. Suppose we used the same sampling method to select different samples and to compute a different interval estimate for each sample: some interval estimates would include the true population parameter and some would not.
A 90% confidence level means that we would expect 90% of the interval estimates to include the population parameter; a 95% confidence level means that 95% of the intervals would include the parameter; and so on.
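The two report metrics above can be illustrated with a short simulation. Google does not document its exact method for Chance to Beat Original; the Bayesian approach below (Beta posteriors over each combination's conversion rate) is one common way to compute such a figure, assumed here for illustration only.

```python
# Illustrative "chance to beat original" via Monte Carlo sampling
# from Beta posteriors -- one common Bayesian approach, not
# necessarily the one Website Optimizer uses.
import random

def chance_to_beat_original(orig_conv, orig_vis, var_conv, var_vis,
                            samples=20_000, seed=42):
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        # Beta(successes + 1, failures + 1) posterior for each rate
        p_orig = rng.betavariate(orig_conv + 1, orig_vis - orig_conv + 1)
        p_var = rng.betavariate(var_conv + 1, var_vis - var_conv + 1)
        if p_var > p_orig:
            wins += 1
    return wins / samples

# Original: 100 conversions / 1000 visitors (10%)
# Combination A: 200 conversions / 1000 visitors (20%)
prob = chance_to_beat_original(100, 1000, 200, 1000)

# Observed improvement, matching the worked example above:
observed_improvement = (200 / 1000) / (100 / 1000) - 1   # +100%
```

With a doubling of the conversion rate over a thousand visitors each, the simulated chance to beat the original is effectively 100%, matching the intuition that large observed improvements confirm quickly.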
If you are using Google's Multivariate test, you will receive an additional screen reporting the same per-combination metrics described above: Chance to Beat Original, Observed Improvement, and Conversions / Visitors.