DIY mobile usability testing - IA Summit 2011


DESCRIPTION

Our presentation for the IA Summit 2011 in Denver.

TRANSCRIPT

1. we are ...
2. Bernard, packet core engineer at NSN
3. Belén, interaction designer at Intel
4. usability testing: "a process that employs people as testing participants who are representative of the target audience to evaluate the degree to which a product meets specific usability criteria." (Handbook of Usability Testing, 2nd Ed., J. Rubin and D. Chisnell)
5. please, stand up everyone
6. sit down if you don't have a US cellphone with a data plan
7. sit down if you don't like beer
8. sit down if you are absolutely terrified by the idea of being our test subject
9. why recording? memory aid; powerful communication tool
10. actions / reactions
11. dut = mut
12. dut = mut + afec, where: dut = desktop usability testing; mut = mobile usability testing; afec = a few extra challenges
13-14. which phone? which context? which connection?
15-18. web task success rates: feature phones 38%, smartphones 55%, touch phones 75% (Mobile usability, J. Nielsen's Alertbox, 20 Jul 2009, http://www.useit.com/alertbox/mobile-usability.html)
19. handset usability affects test results
20. remember ... test with participants' own phones; if not possible, include training and warm-up tasks
21. which phone? which context? which connection?
22. field vs. lab (It's Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild, NordiCHI 2006)
23. experts disagree: "according to our study there was no difference in the number of problems that occurred in the two test settings. Our hypothesis that more problems would be found in the field was not supported" (Usability Testing of Mobile Applications: A Comparison between Laboratory and Field Testing, A. Kaikkonen, T. Kallio, A. Kekäläinen, A. Kankainen, M. Cankar, Journal of Usability Studies, 2005)
24. experts disagree: "evaluations conducted in field settings can reveal problems not otherwise identified in laboratory evaluations" (It's Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild, NordiCHI 2006)
25. but they all agree: "evaluations in the field (are) more complex and time-consuming" (It's Worth the Hassle!, NordiCHI 2006); "testing in the field requires double the time in comparison to the laboratory" (Usability Testing of Mobile Applications, Journal of Usability Studies, 2005)
26. testing in the lab is better than no testing
27. remember ... for most software, lab testing is fine; if you must do field testing: do it late, plan and run pilot tests, be prepared (like the Scouts)
28. which phone? which context? which connection?
29. [chart] US Mobile Data Market Update Q4 2010 and 2010, Chetan Sharma Technology & Strategy Consulting, http://www.chetansharma.com/usmarketupdate2010.htm
