Human-Computer Interaction: A. Mintra Ruensuk


ITE 254

Human-Computer Interaction
A. Mintra Ruensuk
[email protected]
ITE254: Human-Computer Interaction
Week 7: Usability Inspection Methods

Topics
• Heuristic Evaluation

• Severity Ratings

• Guideline Checking

• Cognitive Walkthrough

• Guideline Scoring

• Action Analysis

Severity Ratings
• Severity ratings can help prioritize the fixing of usability
problems.

• After the evaluation sessions, a complete aggregate list of usability problems is given/sent to each evaluator.

• Working independently, each evaluator assigns a severity rating (on a scale of 0–4) to each problem (~30 mins).

• The severity rating of a single evaluator is unreliable; the mean of 3–5 evaluators' ratings is satisfactory.
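The aggregation step above can be sketched in a few lines of Python; the problem names and ratings here are invented for illustration:

```python
# Aggregate independent 0-4 severity ratings by taking the mean per problem.
# Each list holds the ratings of four hypothetical evaluators.
ratings = {
    "Unlabeled search icon": [3, 4, 3, 3],
    "Low-contrast body text": [1, 2, 1, 2],
    "Destructive delete with no undo": [4, 4, 3, 4],
}

mean_severity = {p: sum(r) / len(r) for p, r in ratings.items()}

# Sort so the most severe problems are fixed first.
for problem, score in sorted(mean_severity.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {problem}")
```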

Five-Point Severity Scale
• 0 = not a usability problem at all
• 1 = cosmetic problem only – need not be fixed unless extra time is available
• 2 = minor usability problem – fixing should be given low priority
• 3 = major usability problem – important to fix, high priority
• 4 = usability catastrophe – imperative to fix before release

Order of Criticality
• To explicitly take problem frequency into account,
assign criticality ratings.

• Criticality = Severity Ranking + Frequency Ranking
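A minimal sketch of ranking problems by criticality; the slide gives only the sum formula, so the assumption that both components use a 0–4 scale, and the numbers themselves, are invented:

```python
# Criticality = severity rating + frequency rating.
problems = [
    ("Confusing checkout button", 3, 4),  # (name, severity, frequency)
    ("Rare crash on resize", 4, 1),
    ("Minor typo in footer", 1, 2),
]

# Sort descending so the most critical problems come first.
by_criticality = sorted(
    ((name, sev + freq) for name, sev, freq in problems),
    key=lambda kv: -kv[1],
)
for name, crit in by_criticality:
    print(f"criticality {crit}: {name}")
```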




Guideline Checking
• Guidelines: specific advice about the usability characteristics of an interface.

• An evaluator checks an interface against a detailed list of specific guidelines and produces a list of deviations from the guidelines.

• Whereas heuristic evaluation employs 10 broad principles, guideline checking often involves dozens (or hundreds) of more specific individual items on a checklist.
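A guideline check reduces to a pass/fail sweep over a checklist that reports the deviations; the guideline texts below are illustrative, not taken from any published set:

```python
# Walk a checklist, record pass/fail per guideline, report the deviations.
guidelines = [
    "Every form field has a visible label",
    "Error messages suggest a fix",
    "All pages have a consistent title",
]

# The evaluator's verdict per guideline (True = conforms).
verdicts = [True, False, False]

deviations = [g for g, ok in zip(guidelines, verdicts) if not ok]
print(f"{len(deviations)} deviation(s):")
for g in deviations:
    print(" -", g)
```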

Example Sets of Guidelines



Online Guideline Checking Application


• Testpad (ontestpad.com) provides an environment for creating online checklists (called scripts), which you can then check through on a tablet (say) for each interface.

• The data can then be downloaded as a CSV file for offline analysis.
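One possible shape of that offline analysis, assuming a hypothetical two-column export (guideline, result); the real export format may differ, so treat the column names as placeholders:

```python
# Read an exported checklist CSV and count failed guidelines.
# The CSV content is embedded here to keep the sketch self-contained.
import csv
import io

csv_text = """guideline,result
Every form field has a visible label,pass
Error messages suggest a fix,fail
All pages have a consistent title,fail
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))
failed = [r["guideline"] for r in rows if r["result"] == "fail"]
print(f"{len(failed)} of {len(rows)} guidelines failed")
```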

Pros and Cons of Guideline Checking


• Pros
• cheap

• intuitive

• usable early in development process

• Cons
• time-consuming

• can be intimidating – often hundreds or thousands of specific guidelines


Cognitive Walkthrough
• Task-oriented walkthrough of interface, imagining
novice users’ thoughts and actions. Focuses
explicitly on learnability.

• Design may be mock-up or working prototype.

• Analogous to a structured walkthrough in software engineering.

• Based on a cognitive model (CE+) of human exploratory learning.

Exploratory Learning
• Rather than read a manual or attend a course, users often prefer to learn a new system by "trial and error" -> exploratory learning.
• Start with rough idea of task to be accomplished.

• Explore interface and select most appropriate action.

• Monitor interface reactions.

• Determine what action to take next.



The CE+ Model of Exploratory Learning


• Based on psychological studies, the CE+ model describes
exploratory learning behavior in terms of 3 components:

• Problem-Solving Component

• User chooses among alternative actions based on similarity between the expected consequences of an action and the current goal.

• After executing the selected action, the user evaluates the system response and decides whether progress is being made toward the goal.

• A mismatch results in an undo -> “hill-climbing”.



The CE+ Model of Exploratory Learning


• Based on psychological studies, the CE+ model describes
exploratory learning behavior in terms of 3 components:

• Learning Component

• When the above evaluation process leads to a positive decision, the action taken is stored in long-term memory as a rule.

• Execution Component

• User first attempts to fire an applicable rule matching the current context.

• If none found, problem-solving component is invoked.



Cognitive Walkthrough Preparation


• Identify user population.

• Define suite of representative tasks.

• Describe or implement interface or prototype.

• Specify correct action sequence(s) for each task.



Cognitive Walkthrough Steps


• For each action in solution path, construct credible
“success” or “failure” story about why user would
or would not select correct action.

• Critique the story to make sure it is believable, according to four criteria:

Cognitive Walkthrough Steps


• Will the user be trying to achieve the right effect?

• Will the user know that the correct action is available?

• Will the user know that the correct action will achieve the desired effect?

• If the correct action is taken, will the user see that things are going OK?

Group Walkthrough
• Performed by a mixed team of analysts (designers, engineers, usability specialists).

• Capture critical information on three group displays (flip charts, overheads):
• User knowledge (prior to and after action).

• Credible success or failure story.

• Side issues and design changes.

• Perhaps also videotape entire walkthrough.



Pros and Cons of Cognitive Walkthrough

• Pros
• finds task-oriented problems
• helps define users’ goals and assumptions
• usable early in development process

• Cons
• some training required
• needs task definition methodology
• applies only to ease of learning problems
• time-consuming


Guideline Scoring
• The interface is scored for its conformance to a weighted list of specific guidelines.

• A total score is produced, representing the degree to which the interface follows the guidelines.
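The scoring scheme can be sketched as a weighted average; the guidelines, weights, and conformance values below are invented for illustration:

```python
# Weighted conformance score: each guideline has a weight and a
# conformance value in [0, 1]; the total is the weighted fraction met.
checks = [
    # (guideline, weight, conformance)
    ("Navigation is consistent", 3, 1.0),
    ("Contrast meets minimums", 2, 0.5),
    ("Links are descriptive", 1, 1.0),
]

total_weight = sum(w for _, w, _ in checks)
score = sum(w * c for _, w, c in checks) / total_weight
print(f"conformance score: {score:.0%}")
```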

Guideline Scoring
• Web Technologies - Checklist Homepage Design /
Usability

Pros and Cons of Guideline Scoring


• Pros

• cheap

• intuitive

• Cons

• must select and weight guidelines

• guidelines or weightings often domain-dependent




Action Analysis
• Quantitative analysis of actions to predict time
skilled user requires to complete tasks, based on
time estimates for typical interface actions.

• Focuses on performance of skilled user (efficiency).

• Two flavours (levels of detail):

• a) Formal or “Keystroke-Level”

• b) Informal or “Back-of-the-Envelope”

Keystroke-Level Analysis
• Developed from GOMS (Goals, Operators,
Methods, Selection) modeling.

• Extremely detailed; may often predict task duration to within 20%, but very tedious to carry out.

• Used to estimate performance of high-use systems (e.g. telephone operator workstations).
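A keystroke-level prediction is a sum of per-operator time estimates. The operator values below are the commonly cited KLM averages (K = keystroke, P = point with mouse, H = home hands, M = mental preparation), and the action sequence is a made-up "save via menu" task:

```python
# Commonly cited keystroke-level model operator times, in seconds.
OPERATORS = {
    "K": 0.2,   # keystroke / button press (average skilled typist)
    "P": 1.1,   # point with mouse to a target
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

# "Save file via menu": think, point to File, click, think, point to Save, click.
sequence = ["M", "P", "K", "M", "P", "K"]
predicted = sum(OPERATORS[op] for op in sequence)
print(f"predicted task time: {predicted:.2f} s")
```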


Back-of-the-Envelope Action Analysis


• "Back-of-the-envelope" refers to sketching out a rough analysis on the back of an envelope while away from your desk.

• List actions required to complete a task (as before), but in much less detail – at the level of explaining to a user: "Select Save from the File menu", "Edit the file name", "Confirm by pressing OK".

• At this level of analysis, every action takes at least 2 to 3 seconds.

• Allows quick estimation of the expected performance of the interface for a particular task.
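The 2–3 second rule of thumb turns into a one-line estimate over the example action list above:

```python
# Back-of-the-envelope estimate: count user-level actions and multiply
# by the 2-3 second rule of thumb to bracket the expected task time.
actions = [
    "Select Save from the File menu",
    "Edit the file name",
    "Confirm by pressing OK",
]

low, high = 2 * len(actions), 3 * len(actions)
print(f"{len(actions)} actions -> roughly {low}-{high} s")
```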

Pros and Cons of Action Analysis


• Pros

• predicts efficiency of interface before building it

• Cons

• some training required

• time-consuming
