International Journal of Information Technology & Computer Science (IJITCS)

Abstract:

Websites are in growing use worldwide, but they need to be continuously evaluated and monitored to measure their efficiency, effectiveness and user satisfaction, and ultimately to improve their quality. For this purpose, heuristic evaluation methodologies such as Nielsen’s heuristics have become the accepted means of usability evaluation for user interface designs; however, they are general, and unlikely to encompass all usability attributes for all website domains. The aim of this paper is to enhance one of the most widely used usability evaluation methods by generating specific heuristics for the educational domain, and then to compare and contrast them against Nielsen’s ten heuristics (as a first validation stage for the proposed framework) in terms of the number and severity of problems found, and of a number of usability measurements. The results show that the proposed framework succeeded in building a new set of heuristics for online educational websites, which uniquely discovered 55 (69%) of the usability problems in all the chosen websites (80 problems in total), compared with Nielsen’s heuristics, which uniquely discovered only 6 (8%). The remaining 19 problems (24%) were discovered by both sets of heuristics (overlapping or shared problems). The time taken using Nielsen’s heuristics was less than the time taken using the newly developed Educational Heuristics, but only because Nielsen’s heuristics do not cover all the issues related to educational websites. It appears that the framework for generating context-specific heuristics did in fact produce an efficacious set of Educational Heuristics that covered the issues in this domain well.
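As a sanity check on the figures reported above, the breakdown of the 80 usability problems can be reproduced with a short script (the variable names are illustrative, not taken from the paper):

```python
# Problem counts as reported in the abstract (80 usability problems in total).
eh_only = 55   # problems found only by the Educational Heuristics
nh_only = 6    # problems found only by Nielsen's heuristics
shared = 19    # problems found by both sets of heuristics

total = eh_only + nh_only + shared
print(f"total problems: {total}")  # 80

# Percentages as reported: 69%, 8% and 24% (rounded to the nearest integer).
for label, n in [("EH only", eh_only), ("NH only", nh_only), ("shared", shared)]:
    print(f"{label}: {n}/{total} = {round(100 * n / total)}%")
```

The three counts sum exactly to the reported total, and the rounded percentages match the 69%, 8% and 24% figures quoted in the abstract.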

Keywords:

Heuristic evaluation (HE), Framework, Nielsen’s heuristics (NH), Educational Heuristics (EH), Usability problem, Severity rating

References:

  1. Nielsen, J. and Molich, R. (1990). Heuristic evaluation of user interfaces, Proc. ACM HCI’90 (Seattle, WA, 1-5 April 1990), 249-256.
  2. Chen, S. Y. and Macredie, R. D. (2005). The assessment of usability of electronic shopping: A heuristic evaluation, International Journal of Information Management, vol. 25 (6), pp. 516-532.
  3. AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). Generating Domain-Specific Heuristics for Evaluating Social Network Websites, MASAUM International Conference on Information Technology (MIC-IT'13).
  4. Abuzaid, R. (2010). Bridging the Gap between the E-Learning Environment and E-Resources: A case study in Saudi Arabia. Procedia-Social and Behavioral Sciences, 2(2): 1270-1275.
  5. Ardito, C., Costabile, M., De Angeli, A. and Lanzilotti, R. (2006a). Systematic evaluation of e-learning systems: an experimental validation. In Proceedings of The 4th Nordic Conference on Human-Computer Interaction: changing roles, pp. 195-202. ACM.
  6. Alkhattabi, M., Neagu, D. and Cullen, A. (2010). Information Quality Framework for E-Learning Systems. Knowledge Management & E-Learning: An International Journal (KM&EL), 2(4): 340-362.
  7. Stracke, C. and Hildebrandt, B. (2007). Quality Development and Quality Standards in e-Learning: Adoption, Implementation and Adaptation. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunication 2007, pp. 4158-4165.
  8. ISO (1998), ISO 9241-11: Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs): Part 11: Guidance on Usability.
  9. Muir, A., Shield, L. and Kukulska-Hulme, A. (2003). The pyramid of usability: A framework for quality course websites. In Proceedings of EDEN 12th Annual Conference of the European Distance Education Network, The Quality Dialogue: Integrating Quality Cultures in Flexible, Distance and e-Learning, Rhodes, Greece, pp. 15-18.
  10. Smith-Atakan, S. (2006). Human-Computer Interaction. Thomson Learning EMEA.
  11. Magoulas, G. D., Chen, S. Y. and Papanikolaou, K. A. (2003). Integrating layered and heuristic evaluation for adaptive learning environments. In Proceedings of UM2001, pp. 5-14.
  12. Nielsen, J. (1994). Heuristic evaluation, Usability Inspection Methods, vol. 24, pp. 413.
  13. Hart, J., Ridley, C., Taher, F., Sas, C. and Dix, A. (2008). Exploring the Facebook experience: a new approach to usability. In Proceedings of The 5th Nordic Conference on Human-Computer Interaction: building bridges, pp. 471-474. ACM.
  14. Thompson, A. and Kemp, E. (2009). Web 2.0: extending the framework for heuristic evaluation. In Proceedings of The 10th International Conference NZ Chapter of the ACM's Special Interest Group on Human-Computer Interaction, pp. 29-36. ACM.
  15. Squires, D. and Preece, J. (1996). Usability and learning: evaluating the potential of educational software. Computers & Education, 27(1): 15-22.
  16. Reeves, T. C., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E. and Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation. In Proceedings of the World Conference on Educational Multimedia, Hypermedia and Telecommunications, pp. 1615-162.
  17. Alsumait, A. and Al-Osaimi, A. (2009). Usability heuristics evaluation for child e-learning applications. In Proceedings of The 11th International Conference on Information Integration and Web-based Applications & Services, pp. 425-430. ACM.
  18. Pinelle, D., Wong, N. and Stach, T. (2008). Heuristic evaluation for games: usability principles for video game design. In Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems, pp. 1453-1462. New York, NY, USA.
  19. Chattratichart, J. and Lindgaard, G. (2008). A comparative evaluation of heuristic-based usability inspection methods. In Proceedings of CHI'08 Extended Abstracts on Human Factors in Computing Systems, pp. 2213-2220.
  20. Alrobai, A., AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). Investigating the usability of e-catalogue systems: modified heuristics vs. user testing, Journal of Technology Research.
  21. Tan, W., Liu, D. and Bishu, R. (2009). Web evaluation: Heuristic evaluation vs. user testing, International Journal of Industrial Ergonomics, vol. 39 (4), pp. 621-627.
  22. Nielsen, J. (1992). Finding usability problems through heuristic evaluation. In Proceedings of ACM CHI'92 Conference (Monterey, CA, May 3-7), pp. 373-380. ACM.
  23. AlRoobaea, R., Al-Badi, A. and Mayhew, P. (2012). A Framework for Generating Domain-Specific Heuristics for Evaluating Online Educational Websites - Further Validation, 2nd International Conference on Human Computer Interaction Learning Technology (ICHCILT 2013).
  24. Bernerus, A. and Zhang, J. (2010). A Peek at the Position of Pedagogical Aspects in Usability Evaluation of E-learning System - A Literature Review of Usability Evaluation of E-learning System conducted since 2000. Report/Department of Applied Information Technology 2010: 085.
  25. Fernandez, A., Insfran, E. and Abrahão, S. (2011). Usability evaluation methods for the web: A systematic mapping study, Information and Software Technology.
  26. Liljegren, E. (2006). Usability in a medical technology context assessment of methods for usability evaluation of medical equipment, International Journal of Industrial Ergonomics, vol. 36 (4), pp. 345-352.
  27. Sears, A. (1997). Heuristic walkthroughs: Finding the problems without the noise, International Journal of Human-Computer Interaction, vol. 9 (3), pp. 213-234.
  28. Khajouei, R., Hasman, A. and Jaspers, M. (2011). Determination of the effectiveness of two methods for usability evaluation using a CPOE medication ordering system, International Journal of Medical Informatics, vol. 80 (5), pp. 341-350.
  29. Masip, L., Granollers, T. and Oliva, M. (2011). A heuristic evaluation experiment to validate the new set of usability heuristics. In Proceedings of The 2011 Eighth International Conference on Information Technology: New Generations, pp. 429-434. IEEE Computer Society.

Copyright © 2013 IJITCS. All rights reserved. IISRC® is a registered trademark of IJITCS Properties.