Module Learning Outcomes
Knowledge & Understanding (KU):
Intellectual / Professional skills & abilities (IPSA):
Personal Values Attributes (Global / Cultural awareness, Ethics, Curiosity) (PVA):
Please note that this assessment covers all the learning outcomes.
For this module, summative assessment is by a single in-course assessment, which is worth 100% of the final module mark. This is an individual assessment that aims to give a holistic overview of, and practical exposure to, interaction design, user needs, user behaviours, interface design and usability inspection practices; specifically, it focuses on improving the user experience of website(s) accessed via a smart device (mobile or tablet). The learning activities, tasks, criteria and marking scheme have all been aligned to a set of module learning outcomes (please see above), which in turn map onto programme learning outcomes (please refer to your Programme Handbook).
To support you in evidencing achievement of the module learning outcomes, formative activities directly linked to the assessment tasks, using a mirrored scenario, will run in a flexible weekly delivery mode so that extensive formative feedback can be provided alongside the summative assessment activities.
Module learning will include the use of theory, academic papers, and wider reading of quality information sources. Students will receive written feedback on the summative assessment.
The key tasks and activities for this assessment are detailed below along with the criteria you are expected to meet for each. Please ensure you make careful use of both the criteria given here and the assessment criteria and marking grid so that your work addresses the requirements and standards expected for the module; please ask the module tutor if you are unsure about anything.
You will plan and design a range of usability inspections on a set of websites with the aim of evaluating the user experience on either a smartphone or tablet for a specific organisation and two competitors/comparators. This will be covered in more detail below.
The usability inspections will focus on non-user evaluations, so you will not actually be running tests with real users. You are tasked with inspecting and evaluating the three selected websites in light of the organisation's user needs and behaviours, including (and prioritising) the key activities that users perform within the chosen organisation's website, which you will use to create a user scenario. You must plan the usability inspections using a range of theoretical principles and interaction guides to support the evaluation. These plans should stem from a clearly established case study for the organisation, negotiated with the module tutor, that presents clear organisational need(s) necessitating non-user evaluation of user behaviour and interaction. The tutor will check suitability and achievability within the scope of the module assessment; please ensure that you engage with this process.
In summary, you are essentially seeking to evaluate the organisation's website and two competitors/comparators based on researched user needs from an organisational perspective. A week-by-week indicative schedule (available on the Blackboard Module site) has been produced to support the assessment activities and this must be followed carefully. The assessment deliverables consist of:
A1. Introduction (aims, objectives and high-level description).
This should briefly indicate the reason for carrying out the usability inspections for your selected organisation's website and two of their competitors/comparators. Start by stating which organisation is being investigated and the initial challenges they face with their current website. Identify the two competitors/comparators, stating why they have been sampled as part of this inspection, e.g., a direct competitor, a site the organisation aspires to be like, or a comparator with feature(s) the organisation would like to consider or adopt.
Produce a high-level description of what you hope to achieve from an organisational perspective in the form of a succinctly expressed overarching aim and a bulleted set of objectives. The aim should reflect the overall purpose/goal of the evaluation, and the objectives should support achieving this aim by focusing on the key activities necessary for the investigation of user behaviour and user interaction on a smartphone or tablet using inspection methods.
Finally, we want you to create a user scenario (included in the appendices); this will detail the user tasks that will be used as part of the inspections. These tasks need to be consistent across the organisation's website and the two competitors/comparators (e.g., searching for something, assessing a product or listing, communicating with the vendor, or a combination of these).
A2. Non-user evaluation - User Journey Mapping - Cognitive Walkthrough.
Using the scenario from A1, apply the cognitive walkthrough technique to map out the user journey across the organisation's website and the two competitors/comparators. The walkthrough should be aimed at evaluating understandability and ease of learning from your perspective (as an expert evaluator), noting in the report a summary of the types of interactions, pages accessed, time taken to fulfil the scenario, potential interaction errors, a comparison of the number of steps, etc.
There may be other points you might consider, but fundamentally we want you to use these points to reflect on the standard cognitive walkthrough questions at each step, for example:
- Will the user try to achieve the right effect?
- Will the user notice that the correct action is available?
- Will the user associate the correct action with the effect they are trying to achieve?
- If the correct action is performed, will the user see that progress is being made towards completing the task?
Walking through each of the three sites, try to construct a success story for each step/page based upon the task activities within the scenario you have devised from user needs. Establish 'common features of success' for each site. When a success story cannot be told, construct a failure story, providing the criterion (using one or more of the questions above) and the reason why the user may fail. Using the 'common features of success' for each site, create a new User Journey for the chosen organisation.
As you gather data from the cognitive walkthrough, use Microsoft Excel to support your data capture and analysis (see B1 below). Data capture, analysis, and presentation of findings will be covered in the workshops using a sample template based upon a class example.
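Purely as an illustration of the kind of per-step records a cognitive walkthrough can generate, the sketch below (in Python, using pandas) shows one possible way to structure the data and produce simple per-site summaries before transferring them into the Excel file. The column names, example rows and figures are assumptions for the sketch, not the class template that will be provided.

```python
# Illustrative sketch only: one possible structure for cognitive-walkthrough data.
# Column names and example rows are assumptions, not the official class template.
import pandas as pd

# One row per step of the scenario, per site.
walkthrough_rows = [
    # (site, step_no, page, action, outcome, failed_question, seconds)
    ("Organisation", 1, "Home",    "Open search",    "success", None, 8),
    ("Organisation", 2, "Results", "Filter results", "failure",
     "Will the user notice that the correct action is available?", 25),
    ("Competitor A", 1, "Home",    "Open search",    "success", None, 6),
    ("Competitor A", 2, "Results", "Filter results", "success", None, 12),
]

df = pd.DataFrame(
    walkthrough_rows,
    columns=["site", "step_no", "page", "action", "outcome", "failed_question", "seconds"],
)

# Simple per-site comparison: number of steps, failures and total time.
summary = df.groupby("site").agg(
    steps=("step_no", "count"),
    failures=("outcome", lambda s: (s == "failure").sum()),
    total_seconds=("seconds", "sum"),
)
print(summary)

# The raw rows and the summary can then be transferred into the Excel file for B1,
# e.g. df.to_excel("walkthrough.xlsx", index=False)  # requires openpyxl
```

Keeping one row per step, per site in this way makes the later B2 comparisons straightforward to trace back to the underlying walkthrough data.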
A3. Non-user evaluation - Interface Assessment - Heuristic Evaluation.
This task seeks to identify which interface design elements could be improved to enhance the user experience. As part of this task, you will revisit the cognitive walkthrough steps (and findings), but this time to evaluate the user interface (UI) features in more detail. The heuristic inspection uses the cognitive walkthrough steps and assesses each site in turn, evaluating each interface's ability to meet user needs based upon the identified scenario. The results should be captured using Microsoft Excel (see B1 below) and used to identify design features that will inform a new interface for the organisation.
Assess each of the three sites, complete the following steps, and summarise the findings to inform the report: use the cognitive walkthrough to support the heuristic evaluation of each site, i.e., walk through the three sites again and this time apply Nielsen's 10 usability heuristics, noting examples of good and bad design practice.
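As a further illustration (not a required format), the following Python sketch shows one way heuristic-evaluation findings could be recorded and tallied so that problem areas per site and per heuristic are easy to spot; the example findings and severity values are invented placeholders.

```python
# Illustrative sketch only: one way to record heuristic-evaluation findings per site.
# The field layout and example entries are assumptions, not a prescribed format.
from collections import Counter

# Nielsen's 10 usability heuristics (Nielsen, 1994).
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognise, diagnose, and recover from errors",
    "Help and documentation",
]

# One finding = (site, walkthrough step, heuristic, good/bad, severity 0-4, note).
findings = [
    ("Organisation", 2, "Visibility of system status", "bad", 3,
     "No loading indicator when filtering results on mobile"),
    ("Competitor A", 2, "Consistency and standards", "good", 0,
     "Filter controls follow familiar platform conventions"),
]

# Count issues per site and per heuristic to see where problems cluster.
issues_per_site = Counter(site for site, _, _, kind, _, _ in findings if kind == "bad")
issues_per_heuristic = Counter(h for _, _, h, kind, _, _ in findings if kind == "bad")
print(issues_per_site)
print(issues_per_heuristic)
```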
A4. Applying Information Behaviour Theory
For this task, you should apply the model provided in Appendix 2 to create a set of four tables, as shown in Appendix 3, that indicate what you consider to be users' likely information behaviour for the identified scenario. These tables should be included as part of the spreadsheet analysis. Support this with a 500-word discussion of how information behaviour theory can help a designer determine the content of their website during the development and testing phases of the design process.
Please note, Appendix 2 provides a copy of the model that should be applied in this task (application and use of the model will be explained in the lectures) and Appendix 3 contains an example of how the model has been used with a politics application, which could be mobile or web based.
A5. Recommendations and Sketched UI
Here you should provide recommendations for the chosen organisation as they plan the next stage of development. Use references to the findings from the activities carried out in A1-A4, together with relevant theories/academic papers/best practice, to support the recommendations.
Part B: Spreadsheet Analysis and Results via Microsoft Excel File
B1. Presentation of findings: The spreadsheet results and analysis via a Microsoft Excel file (.xls file) should include data, analysis, and annotations for A2, A3 and A4. Document and present the data collected for each of the three websites, including relevant summaries (i.e., success and failure stories), annotations and/or visualisations.
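Purely as a sketch of how the single workbook could be organised (the file name, sheet names and the .xlsx extension used here are assumptions for illustration), pandas can write the A2, A3 and A4 results to separate sheets of one file:

```python
# Illustrative sketch only: one workbook with a sheet per activity so that B2
# comparisons can be traced back to the B1 data. Names and values are placeholders.
import pandas as pd

# Placeholder frames standing in for the A2, A3 and A4 results.
walkthrough_df = pd.DataFrame({"site": ["Organisation"], "steps": [14]})
heuristic_df = pd.DataFrame({"site": ["Organisation"], "issues": [9]})
behaviour_df = pd.DataFrame({"stage": ["Starting"], "notes": ["example"]})

with pd.ExcelWriter("usability_results.xlsx") as writer:  # requires openpyxl
    walkthrough_df.to_excel(writer, sheet_name="A2 Cognitive Walkthrough", index=False)
    heuristic_df.to_excel(writer, sheet_name="A3 Heuristic Evaluation", index=False)
    behaviour_df.to_excel(writer, sheet_name="A4 Information Behaviour", index=False)
```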
B2. Comparison of the data/findings from B1: Compare and contrast the results across the three websites to identify performance measures from both a quantitative and a qualitative perspective, e.g., number of steps, perceived errors, clarity of visibility, recall and feedback. These measures should all be traceable back to the presentation of findings in B1.
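As a hedged illustration of the kind of quantitative comparison intended here (the metric names and values below are placeholders, not real findings), a small comparison table and per-measure ranking could look like this:

```python
# Illustrative sketch only: metric names and values are placeholders, not real findings.
import pandas as pd

comparison = pd.DataFrame(
    {
        "steps_to_complete_scenario": [14, 11, 12],
        "walkthrough_failures": [3, 1, 2],
        "heuristic_issues": [9, 4, 6],
        "total_time_seconds": [210, 150, 175],
    },
    index=["Organisation", "Competitor A", "Comparator B"],
)

# Rank the three sites on each measure (1 = best, i.e. lowest value).
ranks = comparison.rank(ascending=True).astype(int)
print(comparison)
print(ranks)
```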