What is the most important factor in mobile UX?
Performance is perhaps the most important factor in mobile user experience: users will simply stop using your service if it does not respond quickly enough.
But evaluating and testing the performance of a mobile application is not as straightforward as for traditional web-based solutions, because several additional variables come into play: application structure (browser versus native), network type (2G, 3G, 4G, etc.), payload structure, and so on.
Mobile Performance Testing consists of three parts:
Part 1 - Client Application performance
Part 2 - Server performance
Part 3 - Network performance
This presentation is Part 1 of a series of three webinars in which we explain these three parts.
It covers how to performance test mobile client applications on a mobile device. We will feature 'Angry Birds', explain how to performance test this mobile application, and discuss which tools get the best results.
2. Series of 3 Mobile Performance
Webinars
Part 1: Client applications – Now
Part 2: The Server – Feb 21
Part 3: The Network – March 14
All at 10 AM EST
6. “Proven Results”
“Our products are very complex. We expected a one year ramp up time,
but XBO was able to contribute to our release cycles in just 6 months."
- Development Vice President, Hyperion-Oracle
"The results are very encouraging. Thanks a lot for all the effort put into this
project. We really appreciate your work."
- Manager, Quality Assurance, Autodesk
"Everything you've done is impressive and we are quite pleased, well done!
So keep up the good work."
- QA Director, CA
7. Agenda
• Importance of Mobile Performance Testing
• Types of Mobile Performance Testing
• Mobile Performance Testing for Local (Android) Applications
– Setting up a performance test
– Using different tools to test
• Case study – How fast can Angry Birds run
• Evaluate the tools
• Q&A
8. Importance of Performance
• Mobile internet traffic expected to pass desktop internet
traffic in 2014.
• By 2015, over half of all mobile subscribers are expected to be
engaged in m-payments.
• Amazon reported that the company calculated it lost 1% of
sales for every extra tenth of a second required to load a
page.
• Google experiment: traffic and revenue fell by 20% when the
pages with more results took an extra half second to load.
• Akamai found that 57% of users abandon a page after waiting
3 seconds for it to load.
- Mobile Site Optimization, Sep 2011, Strangeloop
9. Types of Mobile Performance
1. Client Application / device performance
2. Server performance
3. Network performance
10. Server Performance
Following are some of the challenges, variables, and
issues covered in the Server Performance webinar.
• Amount of data sent
• Number of communications between Server and
Client/Browser
• Tools
• Optimization Strategies
11. Network Performance
Following are some of the challenges or variables
covered in the Network Performance webinar.
• Latency, jitter, packet loss,…
• 2.5G, 3G, 4G LTE
• CDNs
12. Local Device and App Performance
Why focus on the local device and App?
• Discover performance based on different
hardware/software configurations with the same
application for a defined task.
• Discover which platform meets a set performance
objective with the least cost.
• Determine which platform uses the least resources.
13. Performance is important to
almost every application or task
• Buy a product
• Obtain a bank balance
• Look up a ticker symbol
• Kill a man with a spear (WoW)
• Download one page of information
• Start an application (game)
15. Performance Test Strategy
Performance test on a few devices, and extrapolate that the application
will also work on devices that are more powerful (more memory,
higher CPU clock, etc.; 5 criteria are defined later).
This differs from functional testing: an application may work
functionally on a certain device, yet not meet performance requirements.
For example, on Android you can test on a well-chosen set of 7-15
devices and be confident that your application will work
functionally on the majority of Android devices. However, some
of those devices may not meet performance requirements.
16. Defining the test
The Product Manager wants to find the lowest-specification phone that
satisfies the performance requirements.
• Requirements:
– Launch Angry Birds in 15 seconds
– Throw an Angry Bird far away in 8 seconds
– Play Angry Birds and play music at the same time
– Play Angry Birds and at the same time delete 100 SMS’s in 5 seconds
Goal: the lowest-cost solution that fits the requirements,
using tools and manual testing to make the purchasing decision.
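The pass/fail decision for requirements like these is simple to automate once timings are collected. A minimal sketch, assuming these task names; the example timings mirror the manual-test figures shown later in the deck:

```python
# Illustrative sketch: check measured task times (seconds) against the
# requirements above. Task names and device dicts are our own conventions.
REQUIREMENTS = {"launch": 15, "throw": 8, "delete_100_sms": 5}

def meets_requirements(measured):
    """True if every measured time is within its required limit."""
    return all(measured[task] <= limit for task, limit in REQUIREMENTS.items())

lenovo_a1 = {"launch": 19, "throw": 8, "delete_100_sms": 4}
htc_desire_hd = {"launch": 14, "throw": 6, "delete_100_sms": 3}

print(meets_requirements(lenovo_a1))      # False: launch took 19 s, limit is 15 s
print(meets_requirements(htc_desire_hd))  # True
```

This also documents the requirements in one place, so they can be re-run unchanged against every candidate device.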
17. Steps for performance test
1. Define task, requirements and test.
2. Choose a sample set of devices.
3. Choose one or more tools and run them.
4. Compare to manual test.
5. Use Quadrant's DB of devices to eliminate those phones
which do not meet performance requirements.
6. Choose a mobile device based on criteria other than
performance.
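For the launch-time measurement in these steps, Android's activity manager can report launch duration from a desktop machine via `adb shell am start -W`, which prints a `TotalTime` value in milliseconds. A sketch of parsing that output (the Angry Birds package and activity names are assumptions, so the adb calls are shown as comments):

```python
import re

def parse_total_time(am_output):
    """Extract TotalTime (milliseconds) from 'adb shell am start -W' output."""
    match = re.search(r"TotalTime:\s*(\d+)", am_output)
    if match is None:
        raise ValueError("no TotalTime line in am output")
    return int(match.group(1))

# With a device attached (package/activity names below are illustrative):
#   adb shell am force-stop com.rovio.angrybirds      # ensure a cold start
#   adb shell am start -W com.rovio.angrybirds/.Main  # launch and wait
sample = "Status: ok\nThisTime: 9870\nTotalTime: 9870\nWaitTime: 9912"
print(parse_total_time(sample) / 1000.0, "seconds")  # 9.87 seconds
```

Killing the process first matters: a warm restart of a cached app can be several times faster than the cold start the requirement is about.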
19. Tools
• Quadrant – measures CPU, Memory, I/O, 2D
and 3D.
• Linpack - measures CPU floating point
performance by single thread and multi
thread.
• Smartbench – simple tool to get an idea of whether the
device is suited for, e.g., gaming.
• Others: Jbenchmark, SPMark, Nenamark
20. Quadrant Result
Total Score averages the scores of the 5 specific criteria.
[Chart: Quadrant Total Score, CPU, Memory, I/O, 2D, and 3D scores (0-3500) for the Lenovo A1, HTC Desire HD, and Motorola ME860]
22. Using Linpack
MFLOPS - Millions of Floating Point Operations per Second

                          Lenovo A1   HTC Desire HD   Motorola ME 860
Single Thread (MFLOPS)    9.91        39.01           38.94
Multi Thread (MFLOPS)     8.55        32.51           51.96
23. Using Smartbench
[Chart: Smartbench Productivity and Game scores (0-3000) for the Lenovo A1, HTC Desire HD, and Motorola ME860]
Motorola ME860 > HTC Desire HD > Lenovo A1
25. Manual Test Result - Angry Birds

Scenario                                    Required     Lenovo A1             HTC Desire HD   Motorola ME 860
1. Launch Angry Birds V2.1.1                15 seconds   19 seconds            14 seconds      10 seconds
2. Throw the bird a far distance            8 seconds    8 seconds             6 seconds       5 seconds
3. Play Angry Birds V2.1.1 and play
   music at the same time                   Continuous   Slight discontinuity  Continuous      Continuous
4. Play Angry Birds and delete 100 SMS
   simultaneously                           5 seconds    4 seconds             3 seconds       2 seconds
26. Results of Manual test
• Lenovo A1 - didn’t meet requirements
• HTC Desire HD - met requirements
• Motorola ME 860 - met requirements
Conclusion
• Since the Lenovo A1 didn't meet the performance
requirements, any device with a lower Quadrant score than
the Lenovo will be rejected.
• Any device with a higher score than the HTC Desire HD will be
accepted.
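The elimination rule above can be written down directly. A sketch with made-up Quadrant scores (the chart does not give exact numbers): devices scoring at or below the failing Lenovo are rejected, those at or above the passing HTC are accepted, and anything in between still needs its own manual test:

```python
# Placeholder Quadrant total scores; the deck's chart values are approximate.
LENOVO_A1 = 1200       # failed the manual test -> rejection threshold
HTC_DESIRE_HD = 2000   # passed the manual test -> acceptance threshold

# Hypothetical device DB of {name: Quadrant total score}.
device_db = {"Phone X": 900, "Phone Y": 1600, "Phone Z": 2400}

rejected = [name for name, score in device_db.items() if score <= LENOVO_A1]
accepted = [name for name, score in device_db.items() if score >= HTC_DESIRE_HD]
undecided = [name for name in device_db
             if name not in rejected and name not in accepted]

print(rejected)   # ['Phone X']
print(accepted)   # ['Phone Z']
print(undecided)  # ['Phone Y'] - scores between the thresholds; test manually
```

The undecided band is why the benchmark alone is not enough: a single synthetic score brackets the decision but cannot replace the manual test for devices near the boundary.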
27. Tying it all together
1. Define task, requirements and test
– Angry Birds, certain speed, multi apps
2. Choose a sample set of devices
– Lenovo, HTC, Motorola
3. Choose a tool (one or more)
– Quadrant (Linpack, Smartbench)
4. Compare to manual test
– Find out which devices conform to the performance requirements
5. Use Quadrant's DB of devices to eliminate those phones
which do not meet performance requirements.
6. Choose a mobile device based on criteria other than
performance.
28. Thanks
Questions and Answers
www.xbosoft.com services@xbosoft.com
408-350-0508
Editor's notes
Jan asks: why? Alan: enterprise purchasing, which devices do you support.
Reminder: after upgrading the OS or firmware of a device, performance is sometimes better and sometimes worse than before; end users should ask or check the manufacturer's info.