Exploring the Usability of Municipal Web Sites: A Comparison Based on Expert Evaluation Results from Four Case Studies
Costin PRIBEANU, Ruxandra-Dora MARINESCU, Dragoş Daniel IORDACHE, Maria GHEORGHE-MOISII
National Institute for Research and Development in Informatics - ICI Bucharest
{pribeanu, doruma, iordache, moise}@ici.ro

The usability of public administration web sites is a key quality attribute for the successful implementation of the Information Society. Formative usability evaluation aims at finding and reporting usability problems as early as possible in the development process. The objective of this paper is to present and comparatively analyze the results of an expert usability evaluation of four municipal web sites. In order to document the usability problems, an extended set of heuristics was used, based on two sources: usability heuristics and ergonomic criteria. The explanatory power of the heuristics was supplemented with a set of usability guidelines. The evaluation results show that a set of specific tasks with clearly defined goals helps to identify many severe usability problems that occur frequently in municipal web sites. A typical issue for this category of web sites is the lack of information support for the user.

Keywords: Formative Usability Evaluation, User Testing, Expert Evaluation, Heuristic Evaluation, Ergonomic Criteria, Usability Problem, Municipal Web Sites

1 Introduction
The quality of public administration systems is a priority for the successful implementation of the Information Society. In this respect, the accessibility and usability of municipal web sites are key quality attributes that make the actual use of public information and services easier for citizens. At the national level, usability is still a research topic addressed only in a few research projects, such as [1], [11], [12]. There is neither an established practice of usability evaluation before software release nor are there experienced usability teams.

According to the time at which it is performed and its objectives, usability evaluation can be formative or summative. Formative usability evaluation aims at finding and fixing usability problems early in the development process [22]. This can be done by conducting an expert usability evaluation (sometimes termed usability inspection) and / or by conducting user testing with a small number of users. In the latter case, the evaluation is said to be user-centered, as opposed to an expert-based formative evaluation.

The objective of this paper is to present and comparatively analyze the results of an expert usability evaluation of four municipal web sites. The expert evaluation method is part of an integrated evaluation methodology that was developed during a research project funded from the Sectorial Plan of the Ministry of Communication and Information Society (MCSI). The methodology was first tested in 2009, when some shortcomings and limitations of the usability inspection method were detected. The main problem was the weak task orientation, which resulted in a low reliability and validity of the expert evaluation results. Therefore, we conducted another expert evaluation in 2010 based on more specific tasks with clearly defined goals. In order to better understand the specific usability issues, we carried out a usability evaluation of four municipal web sites. The rest of this paper is organized as follows.
In the next section we briefly describe related work in the area of usability evaluation of e-government web sites, as well as current concerns regarding usability evaluation methods. Then we present the evaluation objectives, method, and procedure. In sections 4 and 5 we present and analyze the usability evaluation results, both individually and comparatively. The paper ends with conclusions and future work in section 6.
2 Related Work
The ISO/IEC 9126:2001 standard defined usability as the capability of a software system to be understood, learned, used, and liked by the user when used under specified conditions [14]. According to Scriven, usability evaluation can be formative or summative, depending on its purpose and on the moment when it is done [21]. Formative usability evaluation is performed in an iterative development cycle and aims at finding and fixing usability problems as early as possible. The sooner these problems are identified, the less costly is the effort to fix them [22].

Formative evaluation methods are grouped in two broad categories: inspection methods and user testing. Inspection methods are based on the examination of the user interface by a small number of experts, usually by evaluating it against a set of broadly accepted principles. As such, the usability problems reported are anticipated problems, not problems actually observed with users. In recent years there has been a debate on the reliability and validity of individual usability evaluation methods [6], [10], [15]. While usability inspection methods are faster and cheaper, user testing is more expensive and requires specialized expertise. A general approach to increase confidence in the results is to conduct both a heuristic evaluation and user testing and then to analyze and compare the results. In this case, usability problems that were anticipated by the expert evaluation are validated by the user testing results.

All formative evaluation methods aim at finding and reporting usability problems. A usability problem was defined by Nielsen [18] as any aspect of the user interface that might create difficulties for the user with respect to an important usability indicator (such as ease of understanding, learning, and use, or subjective user satisfaction). Usability problems are ranked by their potential impact as severe, moderate, or minor. A severe usability problem means that the user is not able to accomplish the task goal, or that the task ends with a significant loss of data or time. A problem is moderate if it has an important impact on task execution but the user is able to find a solution. A minor usability problem irritates the user but does not have an important impact on the user's task.

Heuristic evaluation provides two kinds of measures: quantitative (the number of usability problems in each severity category) and qualitative (a detailed description of the individual usability problems). Nielsen and Molich [20] proposed 10 heuristics for the evaluation of a user interface, including visibility of system status, compatibility with the activity, user freedom and control, consistency, error prevention, recognition instead of recall, flexibility, aesthetic and minimalist design, and quality of error messages. Bastien and Scapin [4] proposed a set of ergonomic criteria consisting of 18 elementary criteria grouped into 8 categories (general principles), serving for both design and evaluation.

A formative evaluation report should be both reliable and useful for designers [10], [15]. Usability problems have to be well classified according to their impact, well explained, and well documented. An important aspect is the explanatory power of the heuristics, principles, or guidelines that are used to document the usability problems. In this respect, a wider set of usability prescriptions has an increased explanatory power [19].
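To make the two kinds of measures concrete, the sketch below (an assumed structure, not taken from the paper) shows how reported usability problems might be recorded so that the quantitative counts per severity and the qualitative descriptions fall out directly; the first problem text and the heuristic names follow examples used later in this paper, while the second problem is purely illustrative.

```python
# Minimal sketch of a usability problem record for a formative evaluation report.
# The field names and the second example problem are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    description: str   # qualitative measure: what the user runs into
    heuristic: str     # heuristic / ergonomic criterion that is not respected
    severity: str      # "major" (task goal blocked), "moderate" (workaround exists), "minor" (irritation)
    task: str          # evaluation task during which the problem was observed

problems = [
    UsabilityProblem("No contact information for the Local Council chair",
                     "Task compatibility", "major", "T3"),
    UsabilityProblem("Long page without a table of contents",   # hypothetical example
                     "Minimal actions", "minor", "T2"),
]

# Quantitative measure: number of problems in each severity category.
print(Counter(p.severity for p in problems))
# Qualitative measure: the detailed problem descriptions themselves.
for p in problems:
    print(f"[{p.severity}] {p.heuristic}: {p.description}")
```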
Web site usability is therefore a key concern for the effective use of public administration web sites. Nevertheless, there are relatively few papers in the literature that target the usability of municipal (i.e., local public administration) web sites. Bertot [5] argued that usability, functionality, and accessibility are a useful starting point for effective user-centered e-government services. Barnes and Vidgen [3] evaluated online taxation systems and found that five factors are important for the user perception: usability, design, information, trust, and empathy. In their study, usability was rated at 20% based on an A/E score (actual / expected ratio). Baker [2] proposed to advance the usability of e-government through enhanced usability benchmarks, while Molina and Toval [17] proposed a more pragmatic user-centered design approach based on integrating usability requirements that can be evaluated at design time.
3 Method and Procedure
3.1 Usability Evaluation Methodology
In the aforementioned project, an integrated methodology for formative usability evaluation was developed that consists of two main components: a usability inspection method that includes a structured set of usability heuristics, and a user testing method that uses the think-aloud protocol to record the user behavior. The methodology benefits from the complementarity of the two methods and makes it possible to assess the effectiveness and efficiency of the usability inspection method.

The methodology was first tested in 2009, when heuristic evaluations were carried out for two municipal web sites. The heuristic evaluations were performed by teams of 3-4 evaluators testing the user interface with some general tasks. Two reliability indicators were calculated: the average detection rate and the average agreement between any two evaluators [8]. Then user testing was carried out for one of the two web sites. Three indicators of the effectiveness of the heuristic evaluation method were calculated: validity, thoroughness, and general efficiency [7]. The analysis of the results revealed several weaknesses of the heuristic evaluation method regarding reliability and validity [12].

3.2 Usability Heuristics Set
In our methodology we use an extended set of 24 heuristics grouped into six ergonomic criteria: user guidance, work load, adaptability and control, error management, consistency and standards, and compatibility. The set is presented in Table 1 and has been created by integrating the ergonomic criteria proposed by Bastien and Scapin [4] with the ten heuristics proposed by Nielsen and Molich [19].

Table 1. The set of usability heuristics
  User guidance: 1 Visibility of system status; 2 Prompting; 3 Immediate feedback; 4 Grouping / distinction by format; 5 Grouping / distinction by location; 6 Legibility
  Work load: 7 Concision; 8 Recognition instead of recall; 9 Minimal actions; 10 Information density
  Adaptability and control: 11 Flexibility and efficiency of use; 12 Experience of the user; 13 Explicit user actions; 14 User control
  Error management: 15 Error prevention; 16 Quality of error messages; 17 Error correction
  Consistency and standards: 18 Consistency; 19 Compliance with standards and rules; 20 Significance of codes
  Compatibility: 21 Compatibility with the user; 22 Task compatibility; 23 Help and documentation; 24 Esthetic design

Since both original sources have been validated in several studies, we considered them useful in terms of both coverage and explanatory power. A machine-readable form of this grouping is sketched below.
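The following sketch simply restates the grouping of Table 1 as a data structure that an evaluation checklist or a problem-reporting tool could reuse; the constant name is an assumption, not part of the paper.

```python
# The 24 heuristics of Table 1 grouped under the six ergonomic criteria.
HEURISTICS = {
    "User guidance": [
        "Visibility of system status", "Prompting", "Immediate feedback",
        "Grouping / distinction by format", "Grouping / distinction by location", "Legibility",
    ],
    "Work load": [
        "Concision", "Recognition instead of recall", "Minimal actions", "Information density",
    ],
    "Adaptability and control": [
        "Flexibility and efficiency of use", "Experience of the user",
        "Explicit user actions", "User control",
    ],
    "Error management": [
        "Error prevention", "Quality of error messages", "Error correction",
    ],
    "Consistency and standards": [
        "Consistency", "Compliance with standards and rules", "Significance of codes",
    ],
    "Compatibility": [
        "Compatibility with the user", "Task compatibility", "Help and documentation", "Esthetic design",
    ],
}

# Sanity check: six criteria, 24 elementary heuristics in total.
assert len(HEURISTICS) == 6
assert sum(len(h) for h in HEURISTICS.values()) == 24
```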
3.3 Tasks and Procedure
In order to increase the reliability and validity of the usability inspection method, we performed an expert evaluation in 2010 of another four municipal web sites, based on three clearly defined tasks:
T1: to find out where and how to register for an audience.
T2: to identify and download the forms needed to get a birth certificate for a child and to benefit from the state allowance, and also to find out where to send the application.
T3: to find the date of the next Local Council meeting and the contact person for getting informed about the agenda, and also to find and read the minutes of the last meeting.

This approach is both efficient and reliable. On the one hand, it is not possible to evaluate the usability of all web pages of a complex web site; it is more pragmatic to concentrate on a set of key tasks performed by the user. On the other hand, a task-based expert evaluation helps to better estimate the severity of usability problems. We used the heuristics and a set of usability guidelines mainly to document, analyze, and report the usability problems. In this respect, the method differs from a typical heuristic evaluation, where heuristics are mainly used to identify usability problems.

The evaluation was carried out in March 2010 by a team of 4 evaluators. The four web sites belong to the city halls of 4 important towns in Romania. The lists of individual usability problems identified by each evaluator were consolidated for each task based on the similar-changes criterion [8]. During the consolidation phase, duplicates were removed and an agreement on severity was reached. The result of the consolidation phase is a list of unique usability problems, based on which the following aspects were analyzed: the major usability problems (their causes and suggestions to fix them), the nature of the usability problems in terms of the ergonomic criterion / guideline not respected, and the indicators of reliability (individual and average detection rate, average agreement between any two evaluators). The reliability was estimated based on the average detection rate and the average agreement between any two evaluators [7].

4 Usability Evaluation Results
4.1 Case Study 1
The number of individual usability problems detected by the four evaluators varied from 5 problems (of which 3 major) to 12 problems (2 major). Per task, the results of the individual evaluations are as follows: T1: 3 problems (2 major), T2: 8 problems (2 major), and T3: 6 problems (3 major). After the collaborative consolidation (removal of duplicates, discarding of false usability problems, and agreement on severity), a list of 15 unique problems resulted, as shown in Table 2. Most problems are related to the second and third task.

Table 2. List of unique usability problems, broken down by task (T1-T3) and severity (major, moderate, minor).

The three major usability problems are related to the following issues: there is no link to the institution that receives the application for the state allowance; it was not possible to find out the date of the next Local Council meeting; and there is no contact information for the Local Council chair.

Table 3 shows the usability problems according to the heuristic that was not respected. Most usability problems are related to three heuristics: visibility of system status, minimal actions, and task compatibility. However, the major usability problems are related to the last two heuristics in the table.

Table 3. Usability problems by heuristic
  Heuristic                            Total  Major
  Visibility of system status              4      0
  Grouping / distinction by location       1      0
  Minimal actions                          5      0
  Task compatibility                       3      2
  Help and documentation                   2      1
  Total                                   15      3

The individual detection rate varied between 13.33% and 80%, with a mean of 45%. The average agreement between any two evaluators was 24.14%.
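The two reliability indicators reported above can be computed directly from the evaluators' problem lists once they have been mapped onto the consolidated set of unique problems. The sketch below is an illustration, not the authors' tool; the evaluator data are invented, and the any-two agreement is computed here as the mean Jaccard overlap of the problem sets, following the definition attributed to Hertzum and Jacobsen [8].

```python
# Illustrative computation of the individual / average detection rate and the
# average any-two agreement for an expert evaluation (hypothetical data).
from itertools import combinations

def detection_rates(found_by_evaluator, unique_problems):
    """Share of the consolidated unique problems found by each evaluator."""
    return {e: len(found & unique_problems) / len(unique_problems)
            for e, found in found_by_evaluator.items()}

def any_two_agreement(found_by_evaluator):
    """Average Jaccard agreement over all pairs of evaluators."""
    pairs = list(combinations(found_by_evaluator.values(), 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Hypothetical mapping: 4 evaluators against 15 consolidated problems (P1..P15).
unique = {f"P{i}" for i in range(1, 16)}
found = {
    "E1": {"P1", "P2"},                                  # 2/15 = 13.33%
    "E2": {f"P{i}" for i in range(1, 13)},               # 12/15 = 80%
    "E3": {"P2", "P3", "P5", "P9", "P10", "P12"},
    "E4": {"P1", "P3", "P6", "P8", "P11", "P13", "P15"},
}
rates = detection_rates(found, unique)
print({e: f"{r:.2%}" for e, r in rates.items()})
print("average detection rate:", f"{sum(rates.values()) / len(rates):.2%}")
print("any-two agreement:", f"{any_two_agreement(found):.2%}")
```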
4.2 Case Study 2
The lists of individual usability problems produced by the four evaluators varied from 3 problems (2 major) to 11 problems (7 major). Per task, the results of the individual evaluations are as follows: T1: 4 problems (1 major), T2: 17 problems (11 major), and T3: 12 problems (3 major). After the collaborative consolidation, a list of 15 unique problems resulted, as shown in Table 4.

Table 4. Consolidated list of usability problems, broken down by task (T1-T3) and severity (major, moderate, minor).

The four major usability problems are related to the following issues: there is no link to the institution that receives the application for the state allowance; it was not possible to find out the date of the next Local Council meeting; it was not possible to read the minutes of the last meeting; and there is no contact information for the Local Council chair.

Table 5 shows the usability problems according to the heuristic that was not respected. Most of the unique usability problems (86.67%) are due to three heuristics: minimal actions (40%), task compatibility (26.67%), and help and documentation (20%). All major problems are due to the last two heuristics.

Table 5. Usability problems by heuristic
  Heuristic                      Total  Major
  Visibility of system status        1      0
  Minimal actions                    6      0
  Explicit user actions              1      0
  Task compatibility                 4      2
  Help and documentation             3      2
  Total                             15      4

The individual detection rate varied between 13.33% and 66.67%, with a mean of 37.50%. The average any-two agreement between evaluators was 19.57%.

4.3 Case Study 3
The lists of individual usability problems produced by the four evaluators varied from 4 problems (3 major) to 13 problems (3 major). Per task, the results of the individual evaluations are as follows: T1: 10 problems (5 major), T2: 15 problems (6 major), and T3: 11 problems (3 major). After the collaborative consolidation, a list of 15 unique problems resulted, as shown in Table 6.

Table 6. List of consolidated usability problems, broken down by task (T1-T3) and severity (major, moderate, minor).

The two major usability problems are related to the following aspects: it was not possible to find out the agenda of the next Local Council meeting, and there is no contact information for the Local Council chair.

Table 7 shows the usability problems according to the heuristic that was not respected.

Table 7. Usability problems per heuristic
  Heuristic                      Total  Major
  Visibility of system status        2      0
  Minimal actions                    6      0
  Explicit user actions              1      0
  Information density                1      0
  Significance of codes              1      0
  Task compatibility                 3      1
  Help and documentation             1      1
  Total                             15      2

Most of the unique usability problems are due to three heuristics: minimal actions (40%), task compatibility (20%), and visibility of system status (13.33%). The major problems are due to two heuristics: task compatibility and help and documentation.
The individual detection rate reached 73.33%, with a mean of 45%. The average agreement between any two evaluators was 28.86%.

4.4 Case Study 4
The lists of individual usability problems produced by the four evaluators varied from 3 problems (1 major) to 12 problems (0 major). Per task, the results of the individual evaluations are as follows: T1: 11 problems (0 major), T2: 12 problems (3 major), and T3: 13 problems (3 major). After the collaborative consolidation (removal of duplicates, discarding of false usability problems, and agreement on severity), a list of 13 unique problems resulted, as shown in Table 8.

Table 8. Consolidated list of usability problems, broken down by task (T1-T3) and severity (major, moderate, minor).

The single major usability problem is related to the lack of contact information for the Local Council chair. Table 9 shows the usability problems according to the heuristic that was not respected.

Table 9. Usability problems per heuristic
  Heuristic                      Total  Major
  Visibility of system status        1      0
  Immediate feedback                 1      0
  Legibility                         1      0
  Concision                          1      0
  Minimal actions                    6      0
  Explicit user actions              1      0
  Help and documentation             2      1
  Total                             13      1

Most usability problems are due to two heuristics: minimal actions and help and documentation. Six of the 13 unique usability problems are due to the minimal actions heuristic. The individual detection rate varied between 6.67% and 66.67%, with a mean of 36.67%. The average any-two agreement between evaluators was 27.02%.

5 Comparative Analysis of Results
5.1 Specific Usability Problems
The comparative analysis of the case studies makes it possible to draw some conclusions regarding the usability of the web sites, the specific usability problems, and the reliability of the expert evaluation method. The distribution of usability problems and their severity for each task is presented in Table 10.

Table 10. Usability problems by task (T1-T3) and severity (major, moderate, minor).

Most usability problems were anticipated for tasks T2 and T3, and most major problems were identified for task T3. There is one web site with no usability problem for task T1. In total, 10 major usability problems were detected, related to the following issues: there is no link to the institution that receives the application for the state allowance; it was not possible to find out the date and / or agenda of the next Local Council meeting; it was not possible to read the minutes of the last meeting; and there is no contact information for the Local Council chair.

These usability problems are specific to municipal web sites and are due to missing information, which makes it impossible to accomplish the task goal. Firstly, few web sites provide guidance regarding administrative procedures, documents, and application forms. Secondly, the information is not well organized, in that information regarding the same goal is spread across the web site. Thirdly, the contact information is rarely complete and up to date.

5.2 Usability Problems by Heuristics
A synthesis of how the usability problems relate to the heuristics is presented in Table 11. Most usability problems are related to the minimal actions heuristic (39.67%): of these 23 problems, 6 are moderate and 17 are minor.
This heuristic corresponds to the general ergonomic criterion work load, and the related usability problems concern the difficulty of navigating the web site in order to get to the information of interest. In this respect, two usability guidelines are frequently ignored by the developers: provide a table of contents for long pages, and provide facilities for finding information within the web site.

Table 11. Usability problems by heuristic
  Heuristic                            Total  Major
  Visibility of system status              8      0
  Immediate feedback                       1      0
  Grouping / distinction by location       1      0
  Legibility                               1      0
  Concision                                1      0
  Minimal actions                         23      0
  Information density                      1      0
  Explicit user actions                    3      0
  Significance of codes                    1      0
  Task compatibility                      10      6
  Help and documentation                   8      4
  Total                                   58     10

The second category of usability problems is related to task compatibility (17.24%). An important number of usability problems (16 out of 58) are related to two other heuristics: visibility of system status and help and documentation.

The 24 usability heuristics are grouped into six general criteria. Table 12 presents a synthesis of the usability problems by ergonomic criterion. Most usability problems are due to work load (43.1%), compatibility (31.03%), and user guidance (18.97%). All major problems are related to compatibility, which suggests that the web sites were not designed following a user-centered approach.

Table 12. Usability problems by ergonomic criterion
  General ergonomic criterion    Total  Major
  User guidance                     11      0
  Work load                         25      0
  Adaptability and control           3      0
  Consistency and standards          1      0
  Compatibility                     18     10
  Total                             58     10

Since only three tasks were tested for each web site, the actual number of usability problems is likely to be higher, and most major usability problems are related to the public services that are available. It seems that the two compatibility heuristics (task compatibility and help / documentation) have to be detailed by a set of specific usability guidelines.

5.3 Reliability of Evaluation Results
Regarding the reliability of the results, the detection rate is easy to compute, but it suffers from the fact that its minimum depends on the number of evaluators (i.e., 25% for 4 evaluators) [7]. In this respect, the average detection rate should be interpreted on the interval from this minimum to 100%. In the previous study [13], the average detection rate varied between 31.58% and 39.25%; in this study, it varied between 36.67% and 45%. Overall, the average detection rate is low, since two of the evaluators had little experience in usability evaluation. In this respect, the increased detection rate in the second study is explained by the gain in experience during the project.

According to Hertzum and Jacobsen [8], a better reliability indicator is the average agreement between any two evaluators. In the previous case study, the average any-two agreement varied between 2.08% and 10.66%; in this study, it varied between 19.57% and 28.86%. The relatively high level, compared with data from the literature, is explained by the similar expertise of the evaluators.
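The 25% floor mentioned above follows from the way the detection rate is defined; a short derivation (ours, not spelled out in the paper) is given below for the average detection rate.

```latex
% Let U be the consolidated set of unique usability problems and let
% P_i \subseteq U be the set of problems detected by evaluator i (i = 1, ..., n).
% Every consolidated problem is detected by at least one evaluator, hence
% \sum_i |P_i| \ge |U|, and the average detection rate satisfies
\[
\bar{d} \;=\; \frac{1}{n}\sum_{i=1}^{n}\frac{|P_i|}{|U|}
        \;\ge\; \frac{|U|}{n\,|U|} \;=\; \frac{1}{n},
\]
% i.e. a lower bound of 1/4 = 25\% for n = 4 evaluators. An individual rate
% |P_i|/|U| can of course fall below this bound (e.g. 6.67\% in case study 4).
```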
6 Conclusion and Future Work
This study is part of the first systematic experimentation with a usability evaluation methodology in Romania. Inherently, there are several limitations related to the available resources, the experimental (laboratory) character of the evaluation, the novelty of the evaluation activity, and the evaluation expertise. Molich et al. (2004) demonstrated that usability evaluation results depend on the selected tasks, the methodology, and the evaluators [16]. Their comparative studies (CUE 1 in 1998 and CUE 2 in 2004) revealed important differences between the results obtained by
specialized usability teams evaluating the same commercial web site. This case study leads to a similar conclusion: usability evaluation results depend mainly on testing the user interface with a set of tasks having clearly defined goals, and less on the set of heuristics used. On the other hand, a wider set of usability heuristics increases the explanatory power and the usefulness of the evaluation report.

Regarding expert evaluation, it is recommended to carry it out with a larger team (5 to 10 evaluators) having different kinds of expertise. For municipal web sites it would be useful to include both developers and specialists working in the target public services. However, it is recommended that the usability problems be consolidated by 1-2 usability experts.

The current state of web site usability shows that designers need a collection of well-organized prescriptions structured hierarchically: ergonomic criteria, heuristics, and guidelines. Apart from typical web usability guidelines (home page design, information architecture, and navigation), specific content guidelines should also be provided in order to support users in their tasks. These guidelines should require a description of the administrative procedures to be followed by the users who request a public service. In the near future we intend to elaborate a set of specific usability guidelines for municipal web sites and to integrate them into a software tool in order to increase the efficiency of the evaluation process.

Acknowledgement
This work was supported by a research project funded from the Sectorial Research Plan of MCSI.

References
[1] D.M. Andrei, A.D. Mureşan, Hedonic and pragmatic aspects in the evaluation of mobile phone user, Revista Română de Interacţiune Om-Calculator, 1 (special issue RoCHI 2008).
[2] D. Baker, Advancing e-government performance in the United States through enhanced usability benchmarks, Government Information Quarterly, 26.
[3] S. Barnes, R. Vidgen, Data triangulation and web quality metrics: a case study in e-government, Information & Management, 43.
[4] A. Bastien, D.L. Scapin, Ergonomic criteria for the evaluation of human-computer interfaces, Technical Report No. 156, INRIA, Rocquencourt, France.
[5] J.C. Bertot, User centered e-government: Challenges and benefits for government web sites, Government Information Quarterly, 23.
[6] G. Cockton, A. Woolrych, Understanding inspection methods: lessons from an assessment of heuristic evaluation, in A. Blandford, J. Vanderdonckt, P.D. Gray (Eds.), Proceedings of People and Computers XV, Springer-Verlag, 2001.
[7] H.R. Hartson, T.S. Andre, R.C. Williges, Criteria for evaluating usability evaluation methods, International Journal of Human-Computer Interaction, 13.
[8] M. Hertzum, N.E. Jacobsen, The evaluator effect: A chilling fact about usability evaluation methods, International Journal of Human-Computer Interaction, 13.
[9] K. Hornbaek, Current practice in measuring usability: Challenges to usability studies and research, International Journal of Human-Computer Studies, 64.
[10] E.T. Hvannberg, E.L.C. Law, M.C. Larusdotir, Heuristic evaluation: Comparing ways of finding and reporting usability problems, Interacting with Computers, 19.
[11] A.M. Guran, G.S. Cojocar, Approaches in automatic usability evaluation. Comparative study, Revista Română de Interacţiune Om-Calculator, 1(1).
[12] D.D. Iordache, C. Pribeanu, Comparison of Quantitative and Qualitative Data from a Formative Usability Evaluation of an Augmented Reality Learning Scenario, Informatica Economică Journal, vol. 13, no. 3.
[13] D.D. Iordache, R.D. Marinescu, M. Gheorghe-Moisii, C. Pribeanu, A case study in formative evaluation of a local administration web site, Revista Română de Interacţiune Om-Calculator, 3 (special issue RoCHI 2010).
[14] ISO/IEC 9126-1:2001, Software Engineering - Software product quality. Part 1: Quality Model.
[15] E.C. Law, M.C. Lárusdóttir, M. Norgaard (Eds.), Downstream Utility 2007: The Good, the Bad, and the Utterly Useless Usability Evaluation Feedback, Toulouse, France, November 6th, IRIT Press, Toulouse, 2007.
[16] R. Molich, M.R. Ede, K. Kaasgaard, I. Karyuk, Comparative usability evaluation, Behaviour & Information Technology, vol. 23, no. 1, 2004.
[17] F. Molina, A. Toval, Integrating usability requirements that can be evaluated in design time into model driven engineering of web information systems, Advances in Engineering Software, vol. 40, 2009.
[18] J. Nielsen, Usability Engineering, Academic Press, New York.
[19] J. Nielsen, Enhancing the explanatory power of usability heuristics, Proc. ACM CHI'94, 1994.
[20] J. Nielsen, R. Molich, Heuristic evaluation of user interfaces, Proc. ACM CHI'90, 1990.
[21] M. Scriven, Evaluation Thesaurus, 4th ed., Sage Publications, Newbury Park, CA.
[22] M. Theofanos, W. Quesenbery, Towards the Design of Effective Formative Test Reports, Journal of Usability Studies, vol. 1, no. 1, 2005.

Costin PRIBEANU received the PhD degree in Economic Informatics from the Academy of Economic Studies. Currently he is a senior researcher at ICI Bucureşti. Costin Pribeanu is the Vice-Chair for conferences of the Romanian HCI group (RoCHI - SIGCHI Romania). His research interests include: task analysis, user interface design, task-based design of user interfaces, design patterns, usability guidelines, and usability and accessibility evaluation. He is author / co-author of 4 books, 6 edited books, 8 book chapters, 40 journal papers, and over 50 conference papers.

Ruxandra Dora MARINESCU received a degree in Economic Informatics from the Academy of Economic Studies of Bucharest. Currently she is a Senior Analyst at INCDI - ICI Bucureşti. Her research interests include: e-learning, accessibility and usability guidelines, user testing, and expert usability evaluation. She is co-author of 3 conference papers.

Dragoş Daniel IORDACHE received a Master degree in educational counseling from the University of Bucharest in 2006 and is currently a PhD student in educational sciences. Currently he is a psychologist at ICI Bucureşti. Dragoş Daniel Iordache is a member of the Romanian HCI group (RoCHI - SIGCHI Romania) and has served as a reviewer for the last three editions of the national HCI conference. His research interests include: usability and pedagogical evaluation of educational systems, usability guidelines, user testing, and heuristic evaluation. He is author / co-author of 5 journal papers and 15 conference papers.
Maria GHEORGHE-MOISII received a Master degree in organizational management and human resources from the University Spiru Haret, Bucharest. Currently she is a sociologist at ICI Bucureşti. Maria Gheorghe-Moisii is a member of the Romanian HCI group (RoCHI - SIGCHI Romania). Her research interests include: human-computer interaction, usability and accessibility of public administration sites, usability guidelines, user testing, and heuristic evaluation. She is co-author of a paper published at the RoCHI 2010 conference.