International Journal of Information Technology & Computer Science ( IJITCS )

Abstract:

Websites are in growing use worldwide, but they need to be continuously evaluated and monitored to measure their efficiency, effectiveness and user satisfaction, and ultimately to improve their quality. For this purpose, heuristic evaluation (for example against Nielsen's heuristics) and usability testing have become the two most widely used methods for measuring a system's usability, from the perspectives of expert evaluators and of real users respectively. It is recommended that heuristic evaluation be conducted in conjunction with usability testing because the two methods complement each other. However, Nielsen's heuristics are general and not readily applicable to new domains, and user testing is expensive and time-consuming.
For these reasons, the researchers have enhanced these evaluation methods by building a framework designed to improve the usability assessment process for websites in any chosen domain by generating a specific set of heuristics, in this case for the educational domain. The framework is designed to avoid the drawbacks of using general heuristics and usability testing while combining their advantages, and it helps researchers combine feedback from both expert evaluators and potential users in a chosen domain in order to create focused heuristics. This paper conducts a secondary validation stage for the proposed framework by means of usability testing; it also investigates whether it remains essential to conduct usability testing in conjunction with the new heuristics. The results of this usability experiment, based on the usability problems uncovered and their severity, are compared with those of heuristic evaluations using Nielsen's heuristics and the newly developed heuristics, namely the Educational Heuristics. This enables the assessment of their relative efficiency, effectiveness, thoroughness, validity, cost of employment and problem-area identification.
The results show that the proposed framework succeeded in building a new set of heuristics for educational websites, which discovered more problems than either Nielsen's heuristics or usability testing. To remove most usability problems while avoiding wasted money, well-developed, context-specific heuristics, such as our Educational Heuristics, should be employed. Such context-specific heuristics can be successfully created by the framework designed herein.
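The comparative measures named in the abstract (thoroughness, validity and effectiveness) have commonly used definitions in the usability-evaluation literature: thoroughness is the share of all real problems a method finds, validity is the share of a method's reported problems that are real, and effectiveness is their product. The sketch below uses those standard formulas with made-up illustrative counts; it is not a computation from the paper's own data.

```python
# Standard usability-evaluation-method metrics (common literature definitions,
# not figures from this study). All counts below are illustrative placeholders.

def thoroughness(found_real: int, total_real: int) -> float:
    """Share of all real usability problems that the method uncovered."""
    return found_real / total_real

def validity(found_real: int, reported: int) -> float:
    """Share of the method's reported problems that turned out to be real."""
    return found_real / reported

def effectiveness(found_real: int, total_real: int, reported: int) -> float:
    """Combined measure: thoroughness multiplied by validity."""
    return thoroughness(found_real, total_real) * validity(found_real, reported)

# Hypothetical example: a method reports 30 problems, of which 18 are real,
# out of 25 real problems known to exist in the site.
print(round(effectiveness(18, 25, 30), 3))  # 0.72 * 0.6 = 0.432
```

Comparing two evaluation methods on the same site with these metrics makes trade-offs visible: a method can be thorough (finds most real problems) yet have low validity (many false positives), which is one reason studies such as this one report several measures rather than a single score.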

Keywords:

Heuristic evaluation (HE); User testing (UT); Framework; Nielsen's heuristics (NH); Educational Heuristics (EH); Usability problem; Severity rating

References:

  1. Nielsen, J. and Molich, R. (1990). Heuristic evaluation of user interfaces. In Proceedings of ACM CHI'90 (Seattle, WA, 1-5 April 1990), pp. 249-256.
  2. Chen, S. Y. and Macredie, R. D. (2005). The assessment of usability of electronic shopping: A heuristic evaluation, International Journal of Information Management, vol. 25 (6), pp. 516-532.
  3. Oztekin, Z., Kong, J. and Uysal, O. (2010). UseLearn: A novel checklist and usability evaluation method for eLearning systems by criticality metric analysis. International Journal of Industrial Ergonomics, 40(4): 455-469.
  4. Nielsen, J. (1994), Usability engineering, Morgan Kaufmann.
  5. AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). A Framework for Generating Domain-Specific Heuristics for Evaluating Online Educational Websites. 2nd International Conference on Human Computer Interaction Learning Technology (ICHCILT 2013).
  6. AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). Generating Domain-Specific Heuristics for Evaluating Social Network Websites. MASAUM International Conference on Information Technology (MIC-IT'13).
  7. Abuzaid, R. (2010). Bridging the Gap between the E-Learning Environment and E-Resources: A case study in Saudi Arabia. Procedia-Social and Behavioral Sciences, 2(2): 1270-1275.
  8. Ardito, C., Costabile, M., De Angeli, A. and Lanzilotti, R. (2006a). Systematic evaluation of e-learning systems: an experimental validation. In Proceedings of The 4th Nordic conference on human-computer interaction: changing roles, pp. 195-202. ACM.
  9. Alkhattabi, M., Neagu, D. and Cullen, A. (2010). Information Quality Framework for E-Learning Systems.
    Knowledge Management & E-Learning: An International Journal (KM&EL), 2(4): 340-362.
  10. Stracke, C. and Hildebrandt, B. (2007). Quality Development and Quality Standards in E-Learning: Adoption, Implementation and Adaptation. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunication 2007, pp. 4158-4165.
  11. ISO (1998), ISO 9241-11: Ergonomic Requirements for Office Work with Visual Display Terminals
    (VDTs): Part 11: Guidance on Usability.
  12. Muir, A., Shield, L. and Kukulska-Hulme, A. (2003). The pyramid of usability: A framework for quality course websites. In Proceedings of EDEN 12th Annual Conference of the European Distance Education Network, The Quality Dialogue: Integrating Quality Cultures in Flexible, Distance and e-Learning, Rhodes, Greece, pp. 15-18.
  13. Smith-Atakan, S. (2006). Human-computer interaction. Thomson Learning Emea
  14. Magoulas, G. D., Chen, S. Y. and Papanikolaou, K. A. (2003). Integrating layered and heuristic evaluation for adaptive learning environments. In Proceedings of UM2001, pp. 5-14.
  15. Dumas, J. and Redish, J. (1999). A practical guide to usability testing. Intellect Ltd.
  16. Van den Haak, M., de Jong, M. and Schellens, P. (2004). Employing think-aloud protocols and constructive interaction to test the usability of online library catalogues: a methodological comparison. Interacting with computers, 16(6): 1153-1170.
  17. Nielsen, J. (1994). Heuristic evaluation, Usability Inspection Methods, vol. 24, pp. 413.
  18. Hart, J., Ridley, C., Taher, F., Sas, C. and Dix, A. (2008). Exploring the Facebook experience: a new approach to usability. In Proceedings of The 5th Nordic conference on human-computer interaction: building bridges, pp. 471-474. ACM.
  19. Thompson, A. and Kemp, E. (2009). Web 2.0: extending the framework for heuristic evaluation. In Proceedings of The 10th International Conference NZ Chapter of the ACM's Special Interest Group on Human-Computer Interaction, pp. 29-36. ACM.
  20. Squires, D. and Preece, J. (1996). Usability and learning: evaluating the potential of educational software.
    Computers & Education, 27(1): 15-22.
  21. Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E. and Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications, pp. 1615-162.
  22. Alsumait, A. and Al-Osaimi, A. (2009). Usability heuristics evaluation for child e-learning applications. In
    Proceedings of The 11th International Conference on Information Integration and Web-based Applications
    & Services, pp. 425-430. ACM.
  23. Pinelle, D., Wong, N. and Stach, T. (2008). Heuristic evaluation for games: usability principles for video game design. In Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems, pp. 1453-1462. New York, NY, USA.
  24. Chattratichart, J. and Lindgaard, G. (2008). A comparative evaluation of heuristic-based usability inspection methods. In CHI'08 Extended Abstracts on Human Factors in Computing Systems, pp. 2213-2220.
  25. Alrobai, A., AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). Investigating the usability of e-catalogue systems: modified heuristics vs. user testing. Journal of Technology Research.
  26. Tan, W., Liu, D. and Bishu, R. (2009). Web evaluation: Heuristic evaluation vs. user testing, International
    Journal of Industrial Ergonomics, vol. 39 (4), pp. 621-627.
  27. Liljegren, E. (2006). Usability in a medical technology context; assessment of methods for usability evaluation of medical equipment, International Journal of Industrial Ergonomics, vol. 36 (4), pp. 345-352.
  28. Doubleday, A., Ryan, M., Springett, M. and Sutcliffe, A. (1997). A comparison of usability techniques for evaluating design. In Proceedings of the 2nd conference on De-signing interactive systems: processes, practices, methods, and techniques, pages 101-110. ACM.
  29. Jeffries, R., Miller, J. R., Wharton, C. and Uyeda, K. M. (1991). User interface evaluation in the real world: A comparison of four techniques. In Proceedings of ACM CHI'91, pp. 119-124. New York: ACM Press.
  30. Nielsen, J. (1992). Finding usability problems through heuristic evaluation. In Proceedings ACM CHI'92
    Conference (Monterey, CA, May 3-7), pp. 373-380. ACM.
  31. Bernerus, A. and Zhang, J. (2010). A Peek at the Position of Pedagogical Aspects in Usability Evaluation of E-learning System - A Literature Review of Usability Evaluation of E-learning System conducted since
    2000. Report/Department of Applied Information Technology 2010: 085.
  32. Fernandez, A., Insfran, E. and Abrahão, S. (2011). Usability evaluation methods for the web: A systematic
    mapping study, Information and Software Technology.
  33. Sears, A. (1997). Heuristic walkthroughs: Finding the problems without the noise, International Journal of
    Human-Computer Interaction, vol. 9 (3), pp. 213-234.
  34. Khajouei, R., Hasman, A. and Jaspers, M. (2011). Determination of the effectiveness of two methods for usability evaluation using a CPOE medication ordering system, International Journal of Medical Informatics, vol. 80 (5), pp. 341-350.
  35. Masip, L., Granollers, T. and Oliva, M. (2011). A heuristic evaluation experiment to validate the new set of usability heuristics. In Proceedings of The 2011 Eighth International Conference on Information Technology: New Generations, pages 429-434. IEEE Computer Society.

Copyright © 2013 IJITCS. All rights reserved. IISRC® is a registered trademark of IJITCS Properties.