Matthias Heintz, Effie Lai-Chong Law, Stephanie Heintz
Department of Computer Science
University of Leicester
LE1 7RH, Leicester, UK
Abstract: Our online participatory design (OPD) approach to supporting large-scale social requirements engineering (LASSRE) applies PD asynchronously in a distributed setting. Compared to common paper-based face-to-face approaches, OPD allows the involvement of more participants, as an online tool can reach far more people than an ‘offline’ workshop. Some research efforts have been undertaken to develop tools supporting PD. However, such tools have different qualities and characteristics, and a lack of systematic comparisons makes tool selection difficult. Hence, we have been motivated to carry out a review focussing on tools that could support asynchronous distributed OPD of software prototypes. Based on the capabilities of common paper-based PD approaches, on a literature review, and on user needs captured with a questionnaire in the context of a European research project with distributed stakeholders, five basic functional requirements regarding access and essential features of an OPD tool have been identified. To further support the selection process, three tools meeting those requirements have been evaluated in more detail by applying a heuristic walkthrough with Nielsen’s ten widely used usability heuristics. The evaluation results provide insights into how to make these tools better for OPD by fixing the usability issues identified and adding desirable features. Webklipper was evaluated to be the most appropriate candidate as an online annotation tool for distributed participatory design and is therefore discussed in more detail. As one of the major goals of PD is to identify user requirements, which is also a main task of LASSRE, the tools we reviewed, together with their functional requirements and evaluation results, will also be relevant to other LASSRE use cases and activities.
1 INTRODUCTION
Given the broad implications of participatory design (PD), ranging from worker involvement for workplace democratisation to user inclusion in the process of product development, there exist several definitions. Amongst them, we adopt the definition deemed most relevant to our work: “Participatory design—also called cooperative design—is the inclusion of users or user representatives within a development team, such that they actively help in setting design goals and planning prototypes” ([1], p. 239). The main goal of our work is to support user involvement in the process of developing a web-based interactive system, enabling widely distributed stakeholders’ ideas and opinions to be captured, analysed, and eventually incorporated in the design of the system. This resonates with the basic philosophy of large-scale social requirements engineering (LASSRE): to enable representative users to voice their needs and share their views by mitigating the temporal, geographical, technical, and other constraints facing them. Specifically, in the European research project in which we are involved, a portal for online science laboratories is under development, with users being teachers and students of different academic levels, ranging from upper primary school children to university students, across different countries. Obviously, the scale of user involvement is large. A key, if not the key, means to ensure the realisation of LASSRE is the deployment of a usable and useful online PD tool. In this way, teachers and students, irrespective of their physical locations, can get involved in co-designing the portal. Although the advantages of distributed participatory design (DPD) have been known for some time [1], online tools that enable distributed participants to be actively involved in the actual design process of websites and web applications effectively and efficiently are limited. The existing tools have some drawbacks.
A general key issue when using any online tool is the technical barrier caused by the basic requirement to use computers and digital artefacts. For instance, it is less natural to draw virtually on a computer screen than to scribble with a pen on paper, and the technical infrastructure needs to be available for every participant. Knowledge of how to operate the digital tools is also required. With the required knowledge and infrastructure present, however, the advantages can outweigh the disadvantages, especially if the PD takes place in a distributed setting. Compared to face-to-face workshops, online tools are available all the time from any computer with an Internet connection, so users can participate wherever they are. This advantage is further enhanced if a PD activity can be conducted in an asynchronous manner: a researcher or practitioner can set up a PD activity in which a participant can take part whenever it is convenient, and the researcher can analyse the results later on. Another advantage is that such an asynchronous online PD activity allows many more participants than a comparable offline setting (targeting the ‘large-scale’ part of LASSRE). Exchange and sharing of physical artefacts (e.g. paper with scribbles) created during PD sessions can be expensive and complicated; with online tools, every stakeholder can access the results directly via the Internet. As feedback gathered by an online tool is in digital form, this also facilitates the retrieval of raw data as well as the automated derivation of results (e.g. aggregation of comments). Existing DPD tools (e.g. softwiki.de/netzwerk/en/) lack an essential functionality: graphical feedback. Although this is, albeit in various degrees, a very valuable part of many traditional PD techniques (e.g. stickies, paper prototyping, layered elaboration; see e.g. [2] for details), existing DPD solutions still rely heavily on verbal feedback only. The users cannot actively participate in the actual design, e.g.
by drawing on the proposed design solution. There have been some attempts in the scientific community to include graphical as well as written feedback in tools for (D)PD. One of them is GABBEH [3], an electronic paper prototyping tool which allows users to comment on the current design. Another is DisCo [4], an online tool to support intergenerational DPD sessions. However, GABBEH is restricted, as it can only be used in combination with the DENIM tool [5], and DisCo is not yet publicly available. Therefore, they are not considered further in our review. For this paper, we aim to evaluate the functionality and usability of existing, publicly available tools that were not originally designed for, but have been identified as useful for, DPD activities, supporting graphical feedback as well as textual comments. Although it would also be interesting to evaluate tools for synchronous DPD and to compare results gathered with both types of tools, we decided to focus on tools which can be used for asynchronous DPD, given the additional advantages of the asynchronous setting described earlier.

2 FUNCTIONAL REQUIREMENTS

Based on the literature review, the capabilities of paper-based approaches, our empirical observations in previous PD projects, and the results of a questionnaire capturing user needs and conditions in one of our current projects, we have identified five functional requirements (R) for annotation tools that can support DPD of websites and web applications. Each of them is designated with a code (R1, R2 and so forth) for later reference, and each begins with the phrase “The online participatory design tool has to …”:
For a tool to be universally applicable, its installation should not be required. Additionally, to maximize the number of potential participants, the technical barrier for the user should be minimized. R1 and R2 address this issue of a low technical barrier. They ensure that the existing software setup on the users’ computers will enable them to participate in DPD activities (assuming that at least one of the three major browsers [6] is installed already). Research [7] showed that the willingness of users to take part in a remote usability test is higher when no change on the client side is required than when users are asked to reconfigure their browser or install a software application. As remote usability tests are similar to DPD activities, these requirements are valid for our tool analysis (R1 and R2). R3 ensures that feedback will be as specific as possible. If a tool enables users to position a comment directly on an interface element of the prototype, users can show the physical location of the target of their comment visually instead of describing the related interface element verbally (e.g. “the button in the top right-hand corner”). A textual description can then be used to explain the desired change and to give the reasoning behind the comment. Nevertheless, as mentioned earlier, written text alone is not sufficient, especially when the redesign of the interface is based on the user feedback. To fully support PD of the interface, users should be enabled to provide graphical or non-textual feedback, justifying the need for R4. A tool for asynchronous DPD activities should be suitable for the early design stages as well as the subsequent phases of interface development. This way, users will not have to learn a new tool when giving feedback on evolving prototypes of increasing maturity, right up to the final product. Feedback should be possible not only on the graphical design but also on interaction, screen flow, and screen transitions.
Hence, not only static images showing the page design, but direct interactions with the web-based prototype in context need to be supported. These are the reasons for R5. Based on the above-mentioned five requirements, a Google search for tools and web applications has been conducted with the following search phrases: “webapps annotating websites” (about 322,000 results) and “design feedback on websites” (about 186,000,000 results). Especially in the fast-developing world of online applications, where new tools can appear every day, it is not easy to determine when the search process has come to an end. Our approach to this issue was to check the first ten result pages. As the Google search engine sorts results by relevance, this should give a broad coverage of the most relevant tools. The results are described in the next section.

3 RESULTS OF TOOL SEARCH AND FUNCTIONAL REQUIREMENTS EVALUATION
To make the evaluation process more efficient, the remaining requirements were not necessarily checked (entry “nc” for “not checked” in Table 1) once a tool failed one of the five functional requirements (e.g. in most cases, if a tool needed installation (R1), the check for basic drawing functionality (R4) was skipped). If a requirement was not met by a tool, this is indicated in Table 1 by a minus (-). If a requirement was met, we assessed the degree to which the tool fulfilled it, either as basic (o) or extensive (+) (for details see the section about requirement fulfilment by the last three tools). The first two rows in Table 1 are examples of the tools found that needed to be installed, either as a standalone app (Jing) or as a browser plugin (Floatnotes). As R1 and R2 have been used as the key filter criteria to reduce the number of search results to be further evaluated, only two examples not fulfilling them are presented here. In the third row, Drawhere was actually the only tool found that did not support textual feedback (by typing). Of all the tools evaluated, the following three have been found to fulfil all five functional requirements extensively or at least basically (they are listed as the last three tools in Table 1):
Figure 1. Appotate (example project offering help on how to use the tool), one of the three tools identified to fulfil all functional requirements (screenshot of Appotate taken and included with the permission of Ayima Limited)
Appotate (see Figure 1) facilitates feedback from different stakeholders when designing and implementing a website. They can use this tool to comment on design ideas or prototypes at various levels of maturity. MarkUp enables its users to comment on websites by offering the functionality to draw and write on them. Scribbles and text can be customized (e.g. by choosing the colour). Webklipper allows users to annotate a document or a website (e.g. with highlights or comments). The results are then stored in a so-called Klip, which can be shared with others. R1 has been fulfilled extensively by all three aforementioned tools: as they all run inside the browser, nothing needs to be installed besides one of the three major browsers. MarkUp has been rated as only basically fulfilling R2, because it needs to be added to the bookmarks toolbar of the browser; however, this works fine in all three major browsers tested. The two other tools work without modifications of any kind. R3 is fulfilled extensively by all three tools, because they all offer the functionality to give feedback linked to a specific part of the website, although each uses a slightly different approach. The MarkUp tool offers freehand drawing functionality, which is why it fulfils R4 extensively, compared to the other two tools, which only offer (very) basic drawing functionality by adding predefined shapes. Regarding R5, all three tools support interactive prototypes at a variety of maturity levels by displaying a specified website in the tool environment. For the three tools fulfilling all the functional requirements, it is essential to evaluate them against a key quality attribute, usability, which can thus serve as a critical final selection criterion. Basically, the one with the highest usability should be chosen for DPD activities.
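The fail-fast screening described in the previous section can be sketched in code. This is an illustrative sketch only, not the authors' actual procedure: the `check` function below is a hypothetical stand-in for manually inspecting a tool, and the ratings follow the paper's notation ('+' extensive, 'o' basic, '-' not met, 'nc' not checked).

```python
# Illustrative sketch (not the authors' actual tooling): screen a tool
# against requirements R1-R5 in order, stopping at the first failure and
# marking the remaining requirements as 'nc' (not checked), as in Table 1.

REQUIREMENTS = ["R1", "R2", "R3", "R4", "R5"]

def screen(check):
    """check(req) returns '+', 'o', or '-' for one requirement.
    Returns a dict of ratings, with 'nc' after the first failure."""
    ratings = {}
    for i, req in enumerate(REQUIREMENTS):
        rating = check(req)
        ratings[req] = rating
        if rating == "-":  # fail-fast: skip all later checks
            for later in REQUIREMENTS[i + 1:]:
                ratings[later] = "nc"
            break
    return ratings

# Hypothetical tool that needs installation and therefore fails R1:
print(screen(lambda req: "-" if req == "R1" else "+"))
# {'R1': '-', 'R2': 'nc', 'R3': 'nc', 'R4': 'nc', 'R5': 'nc'}
```

A tool qualifies for the detailed usability evaluation only if its ratings contain neither '-' nor 'nc'.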
4 RESULTS OF HEURISTIC WALKTHROUGH

To evaluate which of the tools has the best usability, a heuristic walkthrough as described in [8] has been conducted by the first and third authors of this paper, who each have about three years of expertise and experience in usability evaluation. For pass 1 of the walkthrough, a short scenario has been developed. It describes a user aiming to remove an element, add a (search) button, and move content from one position to another using both graphical and textual feedback. The task thus covers the three basic design feedback options ‘remove’, ‘add’ and ‘move’. In pass 2 of the walkthrough, the tools were freely explored by the two usability researchers independently, based on the knowledge gathered from working through the scenario and guided by the ten commonly used usability heuristics developed by Jakob Nielsen [9]. While most of the results were consistent, the discrepant findings were discussed and consolidated between the two researchers. As suggested in [8] and also done in other heuristic evaluations (e.g. [10]), the severity of each identified issue was rated as either low [SR Low], medium [SR Medium], or severe [SR High]. Low issues have only a minor influence on the perceived usability, so fixing them has the lowest priority. Medium issues are problems which irritate users but for which they can find a workaround; they should be fixed to improve the user experience. Severe issues are major usability problems which need to be fixed urgently. In the following subsections we briefly explain how each of the ten heuristics has been interpreted for the current work and describe the usability problems detected in the three tools together with their severity ratings. A summary can be found in Table 2.

4.1 Visibility of System Status

The tool should always respond quickly to user interactions and visualize its status and activities.
When adding a comment in the Appotate tool, the input field for comments simply disappears after pressing the save button, so the user cannot be sure whether the comment has been saved correctly [SR Medium]. In MarkUp, the highlighting of the currently selected element [SR Medium] and of the currently selected button in the tool menu [SR Medium] is dim and thus hard to identify.

4.2 Match between System and the Real World

The interface of the tool should be easy to understand by using wording and interaction concepts the users know from the real world. Compared to the broad variety available in the real world, drawing functionality is very basic in Appotate [SR Medium] and even more basic in Webklipper [SR High]; neither supports freehand drawing. Additionally, the wording in Appotate could sometimes be improved to speak the users’ language. For instance, the term “turn off comments” is not ideally phrased; “finish editing” would be more appropriate [SR Low]. Another example is “add this page to project” instead of “start commenting on this page”. For users, it is relevant that they can give feedback on an ongoing basis, as everything they are currently working on should automatically belong to the current project. Hence, users should not be bothered with the tool’s internal organization of pages in different projects and with adding pages to those projects. This unnecessarily disturbs users in their task of interacting with the prototype and giving feedback [SR High]. The “publish” functionality in MarkUp does not perform what the user would expect from its name and is therefore not expressed in the user’s language [SR Medium]; “create sharing URL”, for example, would be more appropriate.

4.3 User Control and Freedom

The tool should prevent users from getting “lost”. It should support them in their current task (e.g. by offering undo and redo) with minimal restrictions.
A general problem when navigating through a project in the Appotate tool is that the user can get lost, because navigation through the pages of a project is not clear [SR High]. Undo and redo actions are not supported by Appotate [SR Low]. Additionally, elements in Appotate cannot be modified (e.g. in size or position) once they have been drawn; they can only be deleted and redrawn if the user wants to change their appearance, which also causes the loss of all comments attached to the element [SR High]. Undo and redo [SR Low] and deleting elements [SR Medium] are supported by MarkUp, but only through shortcuts that have no visual representation in the tool. The user can get stuck in a text field when entering a comment, because selecting another tool from the menu or pressing Ctrl+Enter, instead of just clicking outside the text field, is not an intuitive way to leave it [SR Medium]. Undo and redo are not supported in the Webklipper tool [SR Low]. User freedom is slightly constrained because elements cannot be created freely anywhere on the page; they always appear in the centre of the screen and have to be moved to the desired spot from there [SR Low].

4.4 Consistency and Standards

The tool should be consistent both internally and with system and web standards. In Appotate, six issues of inconsistency [SR Low] have been found in the user interface. They include different colours and highlighting of buttons, inconsistent representations and visualizations of interactive elements (as links, buttons or interactive icons), and broken interaction patterns for dropdown lists. Fewer inconsistencies were found in MarkUp. One example is using an ‘X’ to visualize the close button, as most users know it from Windows, but showing it in the upper left instead of the upper right corner, which might be unfamiliar to them [SR Low]. Another example is an iPhone-like “unlock” slider that is used in a web application [SR Low] to publish comments.
It can also be confusing that the size slider in the tool menu is used for stroke width as well as text size [SR Low]. In Webklipper, the use of the closing ‘X’ on the comment dialog is not consistent: for new comments it cancels their creation, while for existing ones it minimizes the comment [SR Low]. The ordering concept of ‘projects’ and ‘pages in projects’ is also not consistent (sometimes from newest to oldest and sometimes the other way around) [SR Low].

4.5 Error Prevention

The tool should prevent errors, for instance, through confirmation dialogs. Appotate actually provides too many confirmation dialogs (e.g. when adding a page to a project), which is not necessary for error prevention and could rather annoy the user [SR Low]. On the other hand, the tool does not inform the user about session timeouts, leading to possible further errors [SR Low]. A refresh of the page or closing the window clears all comments created with MarkUp without a warning [SR High]. Feedback can only be accessed afterwards if the sharing URL has been saved; otherwise it is lost, because it cannot be retrieved without the sharing URL [SR High].

4.6 Recognition rather than Recall

The tool should relieve the user’s memory by giving additional visual cues, enabling easier recognition of information. Appotate shows only links and no thumbnails in the overview of pages in a project. When searching for a specific page in a project, this makes it harder or even impossible for the user to identify the page they are looking for [SR High]. Webklipper also has no thumbnails of pages in the page list, but at least shows more clues than Appotate [SR Medium]. Some functions in MarkUp can only be activated using shortcuts; the missing visual representations make interaction more complicated [SR Medium]. Neither MarkUp nor Webklipper shows tooltips describing the functionality of elements, but the icons are rather self-explanatory [SR Low].
4.7 Flexibility and Efficiency of Use

Shortcuts (one of the original aspects of this heuristic) are not that important for the feedback scenario, as most users might use the tools only once and never reach a state where shortcuts would enhance their experience. The tool should, however, support the user in giving feedback on multiple pages (the whole prototype, not only separate pages). No shortcuts are available in Appotate and Webklipper [SR Low]. MarkUp lacks a project-like structure, which would support the user when working on multiple pages. Instead, the feedback on each page results in a separate sharing URL. As MarkUp does not provide any support for administering them, the user has to take care of this outside the MarkUp tool (e.g. by saving a list of all sharing URLs in a text document or bookmarking them in the browser) [SR High]. In Webklipper, changing the order of the lists on the “My Klips” overview page (‘list of projects’ and ‘list of pages in project’) could improve the workflow and thus the efficiency of the task, but this is not possible [SR Low]. Due to a lack of shapes, commenting can be quite inefficient in Webklipper. For example, instead of crossing something out or drawing an arrow to recommend moving a screen element, workarounds need to be found and used [SR High]. The flexibility of different settings for the shapes (e.g. picking different colours) can be valuable as it supports user freedom, but it might be too much in this case, leading users to “play around” with the settings instead of focusing on the task [SR Medium]. All the settings also require a rather large menu, which can be in the way when resizing a shape [SR Medium].

4.8 Aesthetic and Minimalist Design

The graphical user interface design of the tool should be minimalist but still visually appealing.
At one point the design of Appotate even gets too minimalistic, using the same icon for two different buttons on the same screen (‘go to next page’ and ‘load page’). This might confuse the user [SR Low]. Another issue is that the comment area overlaps with the content and cannot be hidden completely [SR Low]. The Info and Publish buttons in MarkUp have the same functionality [SR Low]; one of them is therefore not needed. Textual feedback can be hard to read, because the text field has no (white) background but shows the website behind and in between the letters [SR Low]. The design of Webklipper is not very aesthetically attractive (plain white text links on a blue background) but functional [SR Low]. Social network sharing links are not needed all the time and could be grouped with the ‘options’ to declutter the interface [SR Low]. Additionally, the overlap of the tool and the website content can cause the tool to get in the way when giving feedback. It would be more usable if the tool could be moved in addition to being minimized, as minimizing makes interacting with the tool impossible [SR Low].

4.9 Help Users Recognize, Diagnose, and Recover from Errors

The tool should support the user in case of an error by describing in natural language what happened and offering options for solving the underlying problem. At one point while interacting with Appotate, an error message with the text “Something went wrong adding your comment” appeared. The only possible action for the user was closing the error message using the OK or X button [SR High]. Trying to access other pages from the main page through the ‘pages in this project’ button in the top menu causes an error when the connection has been lost. However, the error message “Sorry!! No projects have been found matching domain specified” does not help the user [SR High]. No error messages occurred while using MarkUp and Webklipper, so nothing can be said about them.
4.10 Help and Documentation

The tool should offer substantial help and documentation material. The help and documentation provided by Appotate, a default project which describes the tool, might not be discovered by the user [SR Low]. The MarkUp website has a large help section, but it is unfortunately not linked directly from the tool [SR Low]. The user guide of Webklipper is not very specific, has no pictures illustrating the text, and is not accessible directly from the tool [SR Medium].

4.11 Software Bugs Found

Apart from the usability issues described, some software bugs have been identified while evaluating the tools. Although listing those bugs would be out of the scope of this paper, they do affect usability. Hence, it is deemed justified to point out that Appotate and Webklipper had only a few bugs [SR Low], while many bugs (e.g., by clicking with the line tool selected, the user can create dots that cannot be removed) have been found in the MarkUp tool [SR Medium].

5 CONCLUSION AND FUTURE WORK

The results of the heuristic walkthrough are summarized in Table 2, showing that Webklipper has the best usability of the three tools evaluated (ranking based on the number of High, then Medium, and then Low severity issues).
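This ranking rule (ordering tools by their number of High issues, breaking ties by Medium and then Low counts) amounts to a lexicographic sort of issue-count tuples. A minimal sketch follows; the issue counts shown are placeholders for illustration, not the paper's actual totals from Table 2.

```python
# Illustrative sketch of the ranking rule: fewest High issues first,
# ties broken by Medium, then Low counts. Python compares tuples
# lexicographically, so a plain sort on (high, medium, low) suffices.

def rank_tools(issue_counts):
    """issue_counts: {tool_name: (high, medium, low)}.
    Returns tool names ordered from best (fewest severe issues) to worst."""
    return sorted(issue_counts, key=lambda tool: issue_counts[tool])

example = {                       # placeholder counts, not the paper's data
    "Appotate":   (5, 3, 8),
    "MarkUp":     (2, 6, 5),
    "Webklipper": (1, 4, 7),
}
print(rank_tools(example))        # ['Webklipper', 'MarkUp', 'Appotate']
```

Note that with this rule, a tool with one High issue always ranks below a tool with none, no matter how many Medium or Low issues the latter has.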
When using Webklipper in the context of a project, privacy might currently be an issue, because all feedback can be accessed without a login. However, improved privacy settings have already been announced on the tool homepage (http://webklipper.com/u/guide#annotate). The following additional improvement suggestions, derived from the results of the heuristic walkthrough, should be integrated into the Webklipper tool to make it even better suited for DPD activities:
Based on our evaluation results, Webklipper should be used for distributed asynchronous online participatory design activities. If freehand drawing is needed in a use case, MarkUp is the option to go for, but it is currently only advisable for small-scale projects, for instance annotating a single page, because it does not support multiple pages (administration of projects) and has quite a few bugs. If privacy is a primary concern for a project, Appotate should be used until Webklipper improves its privacy settings, because it offers login functionality and its overall functionality is very suitable for DPD. The next step would be a user-based test of the tools to get opinions and feedback from real users, because user testing can identify different and additional usability issues compared to heuristic evaluation [10]. As many of the tools evaluated in this short paper are still in an alpha or beta state (e.g. Appotate), this evaluation could also be repeated in the near future to check whether the improved, more mature tools then meet the given requirements, or whether their rework changes the results of the heuristic walkthrough. Our tool evaluation shows that, although not specifically designed for this purpose, there exist tools that can be used for DPD. This could inspire their usage in large-scale social requirements engineering activities. Further research is needed to compare the results of paper-based and tool-supported activities, to evaluate whether the feedback and requirements gathered with each method are comparable, and to determine to what extent tools could support or even replace paper-based methods. Given the advantages of online PD tools described in the introduction, we believe that such tools are an excellent way to support requirements engineering on a larger scale. Nonetheless, further research based on comparing existing methods and tools, and perhaps also the development of a new, even more dedicated tool, is necessary to validate our assumptions.
Acknowledgements
This work was partially funded by the European Union in the context of the Go-Lab project (Grant Agreement no. 317601) under the Information and Communication Technologies (ICT) theme of the 7th Framework Programme for R&D (FP7). This document does not represent the opinion of the European Union, and the European Union is not responsible for any use that might be made of its content.

REFERENCES

[1] K. Danielsson, A. M. Naghsh, D. Gumm and A. Warr, "Distributed participatory design," in CHI '08 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2008.
[2] G. Walsh, E. Foss, J. Yip and A. Druin, "FACIT PD: a framework for analysis and creation of intergenerational techniques for participatory design," in Proc. of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2013.
[3] A. M. Naghsh and A. Dearden, "GABBEH: A tool to support collaboration in electronic paper prototyping," in CSCW '04: The ACM Conference on Computer Supported Cooperative Work, Chicago, USA, 2004.
[4] G. Walsh, A. Druin, M. L. Guha, E. Bonsignore, E. Foss, J. C. Yip, E. Golub, T. Clegg, Q. Brown, R. Brewer, A. Joshi and R. Brown, "DisCo: a co-design online tool for asynchronous distributed child and adult design partners," in Proc. of the 11th International Conference on Interaction Design and Children, New York, NY, USA, 2012.
[5] M. W. Newman, J. Lin, J. I. Hong and J. A. Landay, "DENIM: An informal web site design tool inspired by observations of practice," Human-Computer Interaction, pp. 259-324, 2003.
[6] P. Bright, "Internet Explorer 10 almost doubles its users thanks to Windows 7 release," Condé Nast, 1 April 2013. [Online]. Available: http://arstechnica.com/information-technology/2013/04/internet-explorer-10-almost-doubles-its-users-thanks-to-windows-7-release/. [Accessed 11 July 2014].
[7] R. Atterer and A. Schmidt, "Tracking the interaction of users with AJAX applications for usability testing," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1347-1350, 2007.
[8] A. Sears, "Heuristic walkthroughs: Finding the problems without the noise," International Journal of Human-Computer Interaction, pp. 213-234, 1997.
[9] J. Nielsen, "Heuristic evaluation," in Usability Inspection Methods, New York, NY: John Wiley & Sons, 1994.
[10] W.-S. Tan, D. Liu and R. Bishu, "Web evaluation: Heuristic evaluation vs. user testing," International Journal of Industrial Ergonomics, pp. 621-627, 2009.

Stephanie Heintz is a PhD student in the Department of Computer Science at the University of Leicester, UK. Her research is in HCI with a focus on educational games. In her dissertation she analyses the impact of game genre on the efficiency of using games to teach educational content. Before coming to Leicester, she obtained an MSc in Computer Science from Furtwangen University of Applied Sciences (Germany), where she also worked and taught for two years.