
Color

Marta Gonzalez Carcedo, Soon Hau Chua, Simon Perrault,


Pawel Wozniak, Raj Joshi, Mohammad Obaid,
Morten Fjeld, Shengdong Zhao 1
§ Background: The Problem of Colorblindness and Existing Aids
§ Design Goals and Challenges

§ Research Questions
§ Experiments

§ Conclusion and Future Work

Color 2
Color 3
Hi, I’m Peter.
I’m red-green
colorblind.

Images and characters courtesy of 20th Century Fox


Color 4
Color 5
Color 6
Normal Color Vision View vs. Simulated Colorblind View (Deuteranope)
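A minimal sketch of how such a simulated colorblind view can be produced in software: apply a 3×3 color matrix to every RGB pixel. The matrix values below are a commonly used deuteranopia approximation, and the snippet is illustrative rather than the exact transform behind these images.

# Approximate deuteranope simulation (illustrative; the matrix is an
# approximation and is applied directly in sRGB for simplicity).
import numpy as np
from PIL import Image

DEUTERANOPIA = np.array([
    [0.625, 0.375, 0.0],
    [0.700, 0.300, 0.0],
    [0.000, 0.300, 0.7],
])

def simulate_deuteranopia(path_in, path_out):
    rgb = np.asarray(Image.open(path_in).convert("RGB"), dtype=float) / 255.0
    sim = np.clip(rgb @ DEUTERANOPIA.T, 0.0, 1.0)   # per-pixel matrix multiply
    Image.fromarray((sim * 255).astype(np.uint8)).save(path_out)

# simulate_deuteranopia("socks.png", "socks_deuteranope.png")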

These two
socks look
cool!
Color 7
Are they the same?
What color are they?
How different are they?

Normal Color Vision View vs. Simulated Colorblind View

Hapticolor
Color 8
Are they the same?
What color are they?
How different are they?

Normal Color Vision View vs. Simulated Colorblind View

Color 9
“Oh, it’s green.” (haptic vibration pattern)

Are they the same?
What color are they?
How different are they?

Normal Color Vision View vs. Simulated Colorblind View

Color 10
Are they the same?
What color are they?
How different are they?

Normal Color Vision View vs. Simulated Colorblind View

Color 11
“Oh, it’s orange.” (haptic vibration pattern)

Are they the same?
What color are they?
How different are they?

Normal Color Vision View vs. Simulated Colorblind View

Color 12
That is cool, but…

How to represent colors in HaptiColor?

How to map the colors onto the haptic motors?

How to represent color distance in HaptiColor?

Color 13
§ Is it familiar to most people, even colorblind people?

§ Does it represent the important colors in the color space?

§ Does it suit HaptiColor?

14
15
12 motors arranged around the wrist (positions 1–12):
recognizing each one accurately can be challenging

16
Fewer motors than colors

17
Are they the same? → Distinguishing colors

What color are they? → Recognizing colors

How different are they? → Comparing colors
18
§ Experiment 1 § Experiment 2 § Experiment 3

RQ1: How many motors can be placed on the wrist without sacrificing recognition accuracy?

RQ2: How to interpolate values between 2 vibration motors using different vibration dimensions and temporalities?

RQ3: Can colorblind users use our vibrotactile encoding solution to accomplish the three types of colour tasks?

Color 19
RQ1: How many motors can be placed on the wrist without sacrificing its recognition accuracy?

The participant selects on the screen the motor that vibrated

Participant wearing one of the wristbands
Color 20
Users can reliably detect motor positions with up to 4 vibration motors (accuracy ≥ 88%)

Recognition accuracy per wristband: 94%, 88%, 81%, 69%
Color 21
Color 22
RQ2: How to
interpolate values
between 2 vibration
motors using
different vibration
dimensions and
temporalities?

Encoding: 3 dimensions, each with 3 levels, on wristband motors M1, M2, M3
(levels range from strong, through medium/low, to idle/none)

Dimension        High (H)   Low (L)   None
Duration (d)     ↑          ↕         0
Pulses (p)       ↑          ↕         0
Intensity (I)    ↑          ↕         0
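To make the interpolation idea concrete, here is a small sketch of one way 12 points on the wheel could be rendered with 3 equally spaced motors: the motor just before the point plays a high level, and the next motor plays a level that grows with the offset. This is an illustrative reconstruction with assumed level names, not HaptiColor's exact algorithm.

# Illustrative sketch (assumed scheme): 12 wheel positions, 3 motors at indices 0, 4, 8.
# The base motor plays "high"; the next motor plays a level that grows with the
# offset. The level names stand in for the paper's duration/pulse/intensity levels.

NEXT_MOTOR = {"M1": "M2", "M2": "M3", "M3": "M1"}
NEIGHBOR_LEVEL = {1: "low", 2: "medium", 3: "high"}   # offset -> level (assumption)

def encode(position):
    """Return a sequential pattern [(motor, level), ...] for a wheel position 0..11."""
    base = "M" + str(position // 4 + 1)   # motor at wheel index 0, 4 or 8
    offset = position % 4                 # how far past that motor the point lies
    pattern = [(base, "high")]
    if offset:
        pattern.append((NEXT_MOTOR[base], NEIGHBOR_LEVEL[offset]))
    return pattern

# encode(0)  -> [("M1", "high")]
# encode(5)  -> [("M2", "high"), ("M3", "low")]
# encode(11) -> [("M3", "high"), ("M1", "high")]

(Played one motor after the other, in line with the result on the next slide that sequential vibrations were more accurate than simultaneous ones.)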

Color 23
- Sequential vibrations are more accurate and faster than motors running at the same time

- The duration dimension has the highest accuracy

Color 24
RQ3: Can colorblind users use our vibrotactile encoding solution to accomplish the three types of colorblind tasks?

Color 25
RQ3: Can colorblind users use our vibrotactile encoding solution to accomplish the three types of colorblind tasks?

Are they the same? → Distinguishing colors

What color are they? → Recognizing colors

How different are they? → Comparing colors

(wristband with motors M1, M2, M3)
Color 26
Are they the same?

“This is RED” → haptic vibration pattern

(wristband with motors M1, M2, M3)
Color 27
How different are they?

Haptic vibration pattern
(wristband with motors M1, M2, M3)
Color 28
How different are they?

Haptic vibration pattern
(wristband with motors M1, M2, M3)
Color 29
Task                                  With HaptiColor   Without HaptiColor
Which shirt has the same color?       100%              94%
Which shirt has the closest color?    97%               73%
Which shirt has the farthest color?   94%               62%
Color 30
§ Developed and evaluated a spatio-temporal vibrotactile code that allows recognition of position and distance
  of 12 discrete points on a wheel, using 3 motors in a wristband

§ Haptically mapped colors to support colorblind users with the color tasks:
  distinguish, recognize, and compare colors

§ The solution is extensible to other applications

Color 31
§ More points on the wheel?
§ Easy to develop the code further

§ Use in navigation tasks
§ showing directions
§ cardinal points

Color 32
I am in a new
city and I need
help to move
around!

M1

M3 M2

Color 33
Haptic vibration pattern
(wristband with motors M1, M2, M3)

Color 34
§ Developed a haptic code to convey position and distance
  of 12 discrete points on a wheel, using a spatio-temporal code with 3 motors in a wristband

§ Haptically mapped colors to support colorblind users with the color tasks

§ The solution is extensible to other applications

Color 36
Color 38
Color 39
40

HaptiColor
§ Affects 250 million people, mostly male

§ Inability to perceive certain colors

§ Red-green and blue-yellow colorblindness

§ Total color-blindness is very rare

(Figure labels: Red-green colorblind, Blue-yellow colorblind)

Color 41
Red-green colorblindness
Color 42
Blue-yellow colorblindness
Color 43
Color 44
Everyday Usage
§ Convenient

§ Non-intrusive

§ Private

§ Familiar

Color 45
§ Distinguishing colors: “Color A and Color B are different”

§ Recognizing colors: “Color A is red and Color B is green”

§ Comparing colors: “Color A is closer to Color B than Color C”

(these tasks are more difficult for colorblind people)

Color 46
HaptiColor 47
Decoding information: “Which bar is ‘chair’, and which is ‘cupboard’?”

Wearing socks: “Did I wear the right pair of socks?”
Color 48
§ Increase the contrast of colors (Recoloring)
  (Figure: Original vs. Simulated, Not Recolored vs. Recolored)

§ Implemented in mobile applications and Google Glass (Tanuwidjaja et al., 2014)

§ Limitation: need to look at the display (not convenient)

Color 49
§ Color correcting lenses (Enchroma CX)

§ Filter certain wavelengths of light to enhance the

contrast of confusing colors seen

§ Limitation: Expensive, may affect depth

perception, only works in bright light

Color 50
Enchroma CX
HaptiColor 51
Normal color vision Colorblind vision (simulated)
Color 52
Identify a person

“Look at that guy in the green shirt!”


Color 53
§ Pattern encoding technique (Sajadi et al.)

§ Apply different patterns to different colors so that

the user can recognize them via patterns

§ Limitations: Require large setup and screen, hard

to see on small display

Sajadi et al’s Pattern Technique Color 54


§ ColourID (Flatla et al.)

§ 3 separate techniques: ColourNames, ColourMeters, ColourPopper

§ Limitations: Need to look at the display (not convenient) Color 55


HaptiColor 56
“Can you find me a shirt that goes well with these shorts?”
No colorblind solution yet allows users to compare colors intuitively
Color 57
Everyday Usage
§ Enables everyday use without looking at the display

§ Allows users to distinguish, recognise, and compare colors effectively

§ Relatively easy to learn and unambiguous

§ Incorporates well with existing wearable devices in the market

§ Low technical overhead

Color 58
HaptiColor 59
Color 60
Visual Auditory Haptic

Color 61
Gloves Ring Wristband

Color 62
Haptic motors
Orbicular
Mapping

Color 63
§ RQ1: How many motors can be placed on the wrist without sacrificing its

recognition accuracy? What is the optimal placement?

§ RQ2: How to interpolate values (values not directly on the motors) between

two vibration motors using different vibration dimensions and temporality?

§ RQ3: Can colorblind users use the spatial vibrotactile encodings we devised

to accomplish the three types of colorblind tasks mentioned earlier?

Color 64
Interaction & Interfaces

Lecture Notes
Chapters 3 and 7

1
Course practicalities

• Lecture material: Updated after the lecture


• Joining groups: I need to come back on this
• Physical lectures: Not until it is announced
• Term assignment: Hard to show examples; share papers
• About me: https://www.uib.no/personer/Morten.Fjeld
2
Outline
Chapter 3: Conceptualizing Interaction:
– Understanding the problem space we are solving by design
– Conceptual models

Chapter 7: Interfaces
– Types of interfaces

Research example:
– Example of haptic user interface: HaptiColor

3
Understanding and Conceptualizing
Interaction, Chapter 3
4
We want a product or solution to be

… safe … enjoyable
… accessible
… functional … respect privacy
… usable
... efficient

… these are design goals


To reach these goals, it is critical to understand problem space
– What do you want to create?

– What are your assumptions about users, their tasks, and their knowledge?

– What do you want the product, or solution, to achieve?

6
Asking questions to understand the problem space

– Are there problems with an existing product or user experience? And why?
– How do you think your proposed design ideas might overcome these?

– If you are designing for a new user experience, how do you think your
proposed design ideas support, change, or extend current ways of doing things?

7
From problem space to design space
– Having a good understanding of the problem space can help inform
the design space:
e.g. what kind of interface, behaviour, functionality to provide

– But before deciding upon these it is important to develop a


conceptual model

https://www.alpine-space.eu/projects/desalps/en/about/the-project/design-thinking/design-thinking-process
8
Mental, conceptual, and cognitive models

Source: https://slideplayer.com/slide/9317671/
9
Conceptual model: what

Source: https://uxdesign.cc/understanding-mental-and-
conceptual-models-in-product-design-7d69de3cae26
10
Conceptual model: why?
The purpose of creating a conceptual model is to get the concepts
and their relationships right, to enable the desired task flow.

It makes sense to get the concepts and their relationships right


before designing how those concepts will be implemented or
presented.

Start by designing how the user would ideally think about the
application and its use in supporting tasks.

A good conceptual model shapes the whole development process.

Source: https://www.morganclaypool.com/doi/pdf/10.2200/S00391ED1V01Y201111HCI012 11
The first steps in formulating a conceptual model: how
– What will the users be doing when carrying out their tasks?

– How will the system support these?

– What kind of interface metaphor, if any, will be appropriate?

– What kinds of interaction modes and styles to use?


– always keep in mind when making design decisions how the user
will understand the underlying conceptual model

12
Interface metaphors

Metaphors are considered to be a


central component of a conceptual model

A metaphor is offered as part of the user interface, for instance as ...


– Desktop
– Search engine

They provide a structure that is similar in some way to aspects of a familiar


entity (or entities) but also have their own behaviors and properties.
– add to shopping cart / trolley / basket

13
Models of interface metaphors

On Jim Alty:
https://www.interaction-design.org/literature/author/james-l-alty

Alty, J. L., Knott, R. P., Anderson, B., & Smyth, M. (2000). A framework for engineering
metaphor at the user interface. Interacting with computers, 13(2), 301-322.
URL: will be updated
14
Interface metaphors (1/4); book collection

https://www.androidpit.com/shelfie-app-free-e-books-digital-books
15
Interface metaphors (2/4): tangible user interface

16
Interface metaphors (2/4): tangible user interface

Mini assignment A:
Join your assignment group, and brainstorm on
an application making use of one idea from MetaDesk.
• What is the added value over traditional planning tools?
• Can you relate MetaDesk to a metaphor?
https://www.youtube.com/watch?v=FsHHYK_UXkw

17
Interface metaphors (2/4): Sifteo Cubes

https://www.youtube.com/watch?v=dF0NOtctaME

Mini assignment B:
Join your assignment group, and brainstorm on
An application making use of Sifteo Cubes
• What is the added value over traditional user interfaces? (1 min)
• Can you relate Sifteo Cubes to a metaphor? (2 mins)

18
Interfaces, Chapter 7

19
Interfaces

1990s

2000s

2010s

20
27
21
1. Command based
– Commands such as abbreviations
(e.g. “ls –l” or “cp”) typed in at the prompt
to which the system responds
(e.g., listing current files)

– Some are hard wired at keyboard, others


can be assigned to keys

– Efficient, precise, and fast


– Codified
– Strict syntax
– Prone to user error
22
2. WIMP and GUI
Xerox Star: the first WIMP system -> gave rise to GUIs

Xerox Star 8010, 1981

23
2. WIMP and GUI
What is the meaning of WIMP?

W =
I =
M = menu or mouse
P =

Xerox Star 8010, 1981

24
2. WIMP and GUI

Windows Icon Menu Pointer

25
3. Multimedia
– Combines different media within a single interface with various forms of
interactivity
– graphics, text, video, sound, and animations

– Users click on links in an image or text


-> another part of the program
-> an animation or a video clip is played
-> can return to where they were or move on to another place
26
4. Virtual reality
– Computer-generated graphical simulations providing:
– the illusion of participation in a synthetic environment rather than external
observation of such an environment

– Provide new kinds of experience, enabling users to interact with objects and
navigate in 3D space

– Create highly engaging user experiences

27
4a. Virtual reality - pros vs cons
– Can have a higher level of fidelity with objects they represent compared to
multimedia

– Induces a sense of presence where someone is totally engrossed by the


experience
– a state of consciousness, the (psychological) sense of being in the virtual environment

28
4a. Virtual reality - pros vs cons
– Provides different viewpoints: 1st and 3rd person

– Head-mounted displays are uncomfortable to wear, and


can cause motion sickness and disorientation

29
30

Youtube: https://www.youtube.com/watch?v=zd5zNo8vQEk&feature=youtu.be

31
4b. Augmented reality

32
4b. Augmented reality

http://www.cs.unc.edu/~azuma/ARpresence.pdf. 33
5. Information visualization (infoviz) and dashboards
– Computer-generated interactive graphics of complex data

– Amplify human cognition, enabling users to see patterns, trends, and anomalies in
the visualization

– Aim is to enhance discovery, decision-making, and explanation of phenomena

– Techniques include:
– 3D interactive maps that can be zoomed in and out of and which present data
via webs, trees, clusters, scatterplot diagrams, and interconnected nodes

34
https://sites.umiacs.umd.edu/elm/research/current-projects/dataworld/

http://users.umiacs.umd.edu/~elm/projects/vistrates/vistrates.pdf
35
6. Web
– User-centered editing tools (e.g. Dreamweaver) and programming languages (e.g.
php, Flash, asp, XML) emerged in the early 2000s

– HTML5, Ajax

– Wikis, blogs

– WordPress, Joomla, Drupal

– In the late 2000s, need to think of how to design information for multi-platforms -
keyboard or touch e.g. smartphones, tablets, PCs
36
7. Consumer electronics and appliances
– Everyday devices in home, public place, or car and personal devices

– Used for short periods


– e.g., putting the washing on, watching a program, buying a ticket, changing the
time, taking a snapshot

– Need to be usable with minimal, if any, learning


37
8. Mobile
– Have become pervasive, increasingly used in all aspects of everyday and working
life

– Handheld devices intended to be used while on the move

– Apps running on mobiles have greatly expanded, e.g.


– used in restaurants to take orders
– car rentals to check in car returns
– supermarkets for checking stock
– in the streets for multi-user gaming
– in education to support life-long learning

38
8. Mobile - challenges
– Smaller screens, small number of physical keys and restricted number of controls

– Usability and preference varies


– depends on the dexterity and commitment of the user

– Smartphones overcome mobile physical constraints through using multi-touch


displays

39
8. Mobile - design issues
– Mobile interfaces can be tricky and cumbersome to use for
those with poor manual dexterity

– Key concern is hit area (related to so-called “fat finger” problem)

– area on the phone display that the user touches to make


something happen, such as a key, an icon, a button or an app

– space needs to be big enough for fat fingers to accurately


press

– if too small the user may accidentally press the wrong key

40
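A quick arithmetic sketch of what a hit area means in pixels: assuming a roughly 9 mm minimum target (a commonly cited rule of thumb, treated here as an assumption), the pixel size depends on the display density.

# Sketch: convert a 9 mm touch target into pixels at different display densities.
def mm_to_px(mm, ppi):
    return mm / 25.4 * ppi        # 25.4 mm per inch

for ppi in (160, 326, 450):       # example densities (low, iPhone-class, high-end)
    print(f"{ppi} ppi: 9 mm target ≈ {mm_to_px(9, ppi):.0f} px")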
9. Speech
– Where a person talks with a system that has a spoken language application, e.g.
timetable, travel planner

– Used most for inquiring about very specific information, e.g. flight times or to
perform a transaction, e.g. buy a ticket

– Privacy in the use of speech-based UIs is an issue

– Also used by people with disabilities


– e.g. speech recognition word processors,
– web readers,
– home control systems
41
10. Pen
– Enable people to write, draw, select, and move objects at
an interface using light pens or styluses

– capitalize on the well-trained drawing skills


developed from childhood

– Digital pens, e.g. Anoto, use a combination of ordinary


ink pen with digital camera that digitally records
everything written with the pen on special paper

Source: https://beatsigner.com/interactive-paper.html

42
11. Touch
– Touch screens, such as walk-up kiosks, detect the presence and
location of a person’s touch on the display

– Multi-touch support a range of more dynamic finger tip actions, e.g.


swiping, flicking, pinching, pushing and tapping

– Now used for many kinds of displays, such as Smartphones, iPods,


tablets and tabletops
43
12. Air-based gestures
– Uses camera recognition, sensor and computer vision techniques
– can recognize people’s body, arm and hand gestures in a room
– systems include Eye Toy, Kinect, xBox, Wii

44
12. Air-based gestures
– Movements are mapped onto a variety of gaming motions, such as swinging,
bowling, hitting and punching

– Players represented on the screen as avatars doing same actions

– Gives haptic feedback (e.g. vibrations)

45
13. Haptic
– Tactile feedback
– applying vibration and forces to a person’s body, using actuators that are
embedded in their clothing or a device they are carrying, such as a smartphone

– Can enrich user experience or nudge them to correct error

– Can also be used to simulate the sense of touch between remote people who want
to communicate

– Vibrotactile feedback (feeling sound)

46
13. Haptic - realtime vibrotactile feedback

47
14. Multimodal
– Meant to provide enriched and complex user experiences
– multiplying how information is experienced and detected using different
modalities, i.e. touch, sight, sound, speech

– support more flexible, efficient, and expressive


means of human–computer interaction

– Most common is speech and vision

48
15. Shareable
– Shareable interfaces are designed for more than one person to use
– provide multiple inputs and sometimes allow simultaneous input by co-located
groups

– e.g. DiamondTouch, Smart Table and Surface


49
15. Shareable - advantages
– Provide a large interactional space that can support flexible group working

– Can be used by multiple users


– Can point to and touch information being displayed
– Simultaneously view the interactions and have same shared point of reference
as others

– Can support more equitable participation compared with groups using a single PC

50
16. Tangible
– Type of sensor-based interaction, where physical objects, e.g., bricks, are coupled
with digital representations

– When a person manipulates the physical object/s it causes a digital effect to occur,
e.g. an animation

– Digital effects can take place in a number of media and places or can be embedded
in the physical object

51
16. Tangible - examples
– Flow Blocks
– depict changing numbers and lights embedded
in the blocks
– vary depending on how they are connected
together

– Urp
– physical models of buildings moved around on
tabletop
– used in combination with tokens for wind and
shadows -> digital shadows surrounding them to
change over time

52
16. Tangible - benefits
– Can be held in both hands and combined and manipulated in ways not possible
using other interfaces
– allows for more than one person to explore the interface together
– objects can be placed on top of each other, beside each other, and inside each
other
– encourages different ways of representing and exploring a problem space

– People are able to see and understand situations differently


– can lead to greater insight, learning, and problem-solving than with other kinds
of interfaces
– can facilitate creativity and reflection

53
17. Augmented and mixed reality - examples
– In medicine
– virtual objects, e.g. X-rays and scans, are overlaid
on part of a patient’s body
– aid the physician’s understanding of what is being
examined or operated on

– In air traffic control


– dynamic information about aircraft overlaid on a
video screen showing the real planes, etc. landing,
taking off, and taxiing
– Helps identify planes difficult to make out

54
18. Wearables
– First developments were head- and eyewear-mounted cameras that enabled user
to record what was seen and to access digital information

– Since, jewellery, head-mounted caps, smart fabrics, glasses, shoes, and jackets have
all been used
– provide user with a means of interacting with digital information while on the
move

– Applications include automatic diaries, tour guides, cycle indicators and fashion
clothing

55
19. Robots and drones
– A machine designed to carry out certain tasks automatically

– 4 types of robot
– remote robots used in hazardous settings
– domestic robots helping around the house
– pet robots as human companions
– sociable robots that work collaboratively with
humans, and communicate and socialize with them
- as if they were our peers

56
19. Robots and drones - drones
– Unmanned aircraft that are controlled remotely and used in a number of contexts
– e.g. entertainment, such as carrying drinks and food to people at festivals and
parties;
– agricultural applications, such as flying them over vineyards and fields to collect
data that is useful to farmers
– helping to track poachers in wildlife parks in Africa

57
20. Brain-computer Interface (BCI)
– provides a communication pathway between a person’s brain waves and an
external device, such as a cursor on a screen

– Person is trained to concentrate on the task,


e.g. moving the cursor

– BCIs work through detecting changes in the


neural functioning in the brain

– BCIs apps:
– Games
– enable people who are paralysed to control robots

58
Cognitive Aspects - I

Week 4

Lecture Notes

1
These slides may be better for
self study.

2
Data Gathering & Data Analysis
Ch 8, 9, 10, 11

Lecture Notes

3
Establishing Requirements

4
What is a requirement?
– A statement about an intended product that specifies what it should do or how to do it

– It must be specific, unambiguous and clear

– E.g., “a specific button must enable printing of the contents of the current screen”

– It helps us move from problem space to design space

5
What is a requirement?
What needs to be achieved?
– Understand as much as possible about users, task, context
– Produce a stable set of requirements

How can this be done?


– Data gathering activities
– Data analysis activities
– Expression as ‘requirements’
– Iterative process
– new: Evaluation

6
Why are requirements important?

Getting requirements right is crucial


7
Why are requirements important?

Getting requirements right is crucial


8
Who are the users (or: end-users)?
– Those who:
– interact directly with the system
– who receive output from the system
– who make the purchasing decision
– who use competitors’ system

– 3 main types of user (or: end-user)


– Novice
– Knowledgeable / intermittent user
– Expert / frequent user

9
Dreyfus model of skill acquisition (1986):

10
Dreyfus model of skill acquisition (1986):

11
Who are the stakeholders?
– anyone involved, affected, or influenced by what you are designing.

12
Functional requirements …
Describe what a system must do (its functionality)

Include:
– Descriptions of the processing that system will be required to carry out
– interfaces with users and other systems (Details of inputs and outputs)
– what the system must hold data about

“A system must send an email whenever a certain condition is met (e.g. an order is
placed, a customer signs up, etc).”

Modelled with Use Case Diagrams, e.g. UML.

13
… functional requirements; use case diagrams

UML sequence diagram of the “Ticket vending” use case


https://www.researchgate.net/publication/220375312_Using_the_Railway_Mobile_Terminals_in_the_Process_of_Validation_and_Vending_Tickets/figures?lo=1
14
Non-functional requirements
Concerned with how well the system performs

Include:
– Performance criteria such as desired response times for updating data or retrieving data
from the system
– Ability of the system to cope with a high level of simultaneous access by many users
– Availability of the system with minimum of downtime
– Time taken to recover from a system failure
– Anticipated volumes of data in terms of transaction throughput
– Security considerations such as resistance to attacks, ability to detect attack
– Usability and user experience goals

“Emails should be sent with a latency of no greater than 12 hours from such an activity.”

15
Data Gathering

16
Data gathering methods

Common data collection methods;


– Interviews
– Questionnaires
– Observation
– Web analytics

17
Interviews
– Interviews can be thought of as a “conversation with a purpose”.

– How like an ordinary conversation the interview can be depends on the type of
interview method used.

– There are four main types of interviews:


– unstructured
– structured
– semi-structured
– group interviews

18
Unstructured:
– some general topics, but no predetermined specific questions
– to gather rich data about users’ experiences
– have a plan of the main topics to be covered

Structured:
– predetermined questions for each user
– questions need to be short and clearly worded
– useful when the goals are clearly understood

Semi-structured:
– combination of structured and unstructured interviews
– has a basic script for guidance
– starts with preplanned questions and then probes the user
19
Focus group!
– It’s a group interview

– 3-10 people are involved

– It allows diverse or sensitive issues to be raised

– It aims to enable people to put forward their own opinions in a supportive


environment

– The facilitator guides and prompts discussion;


the discussion is usually recorded for later analysis

20
Running the interview
– Introduction – introduce yourself, explain the goals of the interview, reassure
about the ethical issues, ask to record, present the informed consent form.

– Warm-up – make first questions easy and non-threatening.

– Main body – present questions in a logical order

– A cool-off period – include a few easy questions to defuse tension at the end

– Debriefing – thank interviewee, signal the end, e.g. switch recorder off.

21
Questionnaires
Aims to obtain the views of a large number of people in a way that can be analyzed
statistically

Includes:
– postal, web-based and email questionnaires
– open-ended and closed questions
– gathering opinion as well as facts

22
Questionnaire design (e.g. Google tools)
– The impact of a question can be influenced by question order (may use balanced order)

– You may need different versions of the questionnaire for different populations

– Provide clear instructions on how to complete the questionnaire

– Strike a balance between using white space and keeping the questionnaire compact

– Avoid very long questionnaires

– Cluster into groups of questions

23
Question and response format
Closed questions Open questions
– ‘Yes’ and ‘No’ checkboxes
– Checkboxes that offer many options
– Rating scales
– Likert scales (3, 5, 7 or more points)

24
Question and response format:
Likert Scale
A Likert scale is
a psychometric scale commonly
involved in research that
employs questionnaires. It is the
most widely used approach to scaling
responses in survey research, such
that the term (or more accurately
the Likert-type scale) is often used
interchangeably with rating scale,
although there are other types of
rating scales.

25
Web analytics
– A system of tools and techniques for optimizing web usage by:
– Measuring
– Collecting
– Analyzing
– Reporting web data

– Typically focus on the number of web


visitors and page views.

26
Web analytics

27
Web analytics

28
Web analytics

29
30
Why are requirements important?

After 10 years of Skyss ticketing system:


Major re-design

31
Why are requirements important?

After 10 years of Skyss ticketing system:


Major re-design:

• Choose 3 techniques for your mini-project


• Argue for “why”; “good for”
• Which kind of data
• Pros and cons
32
Informed consent
– For research involving human participants an informed consent will establish a
professional and responsible relationship between researcher and participant

Consent = permission to study/observe person


Informed = providing person/participant with information about
the study’s purpose

– Participants usually want reassurance that the information provided will not be
used for other purposes, or in any context that would be detrimental to the
participant

33
Privacy
– Are you collecting data that may identify your participant?

– Personal data is information that can be used to identify individuals:


– directly through name, personal identification number, or other unique
personal characteristics
– indirectly through a combination of background information such as place of
residence or institutional affiliation, combined with data on age, gender,
occupation, diagnosis, etc.
– by being traceable to an e-mail or IP address (e.g. online surveys)

– http://www.nsd.uib.no/personvernombud/en/notify/index.html

34
Data Analysis

35
Quantitative data
– It’s about testing or proving something with a large sample size.

– Large sample size

– Questionnaires

– Collection is done by measuring things

– Assumes a fixed and measurable reality

36
Qualitative data
– It’s about discovering new things with a small sample size.

– Small number of users (10-20)

– Rich descriptions

– Collection is done by observing and/or interviewing participants

– Assumes a dynamic and negotiated reality

37
38
Basic quantitative analysis
– Averages
– Mean: add up values and divide by number of data points
– Median: middle value of data when ranked
– Mode: figure that appears most often in the data
– Percentages
– Be careful not to mislead with numbers!
– Graphical representations give overview of data

(Example charts: “Number of errors made” per user, and frequency of Internet use)
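A quick sketch of these basic measures on a small, made-up sample of error counts (illustrative numbers only, not the data behind the charts):

# Mean, median and mode of a small "number of errors made" sample (made-up data).
from statistics import mean, median, mode

errors = [2, 4, 4, 5, 7, 9, 15]

print("mean:  ", mean(errors))    # add up values, divide by count  -> 6.57...
print("median:", median(errors))  # middle value when ranked        -> 5
print("mode:  ", mode(errors))    # most frequent value             -> 4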
39
Basic qualitative analysis
– Recurring patterns or themes
– Emergent from data, dependent on observation framework if used
– Categorizing data
– Categorization scheme may be emergent or pre-specified
– Looking for critical incidents
– Helps to focus in on key events

40
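A minimal sketch of categorizing data: counting how often each code or theme occurs across coded interview excerpts (the codes below are made up):

# Count recurring themes in coded qualitative data (hypothetical codes).
from collections import Counter

codes = ["navigation", "trust", "navigation", "privacy", "trust", "navigation"]
print(Counter(codes).most_common())   # [('navigation', 3), ('trust', 2), ('privacy', 1)]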
Tools to support data analysis

– Spreadsheet – simple to use, basic graphs


– Statistical packages, e.g. SPSS, R
– Qualitative data analysis tools
– Categorization and theme-based analysis
– Quantitative analysis of text-based data
– Nvivo and Atlas.ti support qualitative data analysis
– CAQDAS Networking Project, based at the University of Surrey
(http://caqdas.soc.surrey.ac.uk/)

41
SWITCH POWERPOINT
Data Interpretation

43
Data interpretation
– Start soon after data gathering session

– Initial interpretation before deeper analysis

– Different approaches emphasize different elements e.g. class diagrams for object-
oriented systems, entity-relationship diagrams for data intensive systems

44
Task descriptions
– Scenarios
– an informal narrative story, simple, ‘natural’, personal, not generalisable

– Use cases
– assume interaction with a system
– assume detailed understanding of the interaction

– Essential use cases


– abstract away from the details
– does not have the same assumptions as use cases

45
Scenarios - to identify potential vacation options
The Thomson family enjoy outdoor activities and want to try their hand at sailing
this year. There are four family members: Sky (10 years old), Eamonn (15 years old),
Claire (35), and Will (40). One evening after dinner they decide to start exploring
the possibilities. They all gather around the travel organizer and enter their initial
set of requirements – a sailing trip for four novices in the Mediterranean. The
console is designed so that all members of the family can interact easily and
comfortably with it. The system's initial suggestion is a flotilla, where several crews
(with various levels of experience) sail together on separate boats. Sky and Eamonn
aren’t very happy at the idea of going on vacation with a group of other people,
even though the Thomsons would have their own boat. The travel organizer shows
them descriptions of flotillas from other children their ages and they are all very
positive, so eventually, everyone agrees to explore flotilla opportunities. Will
confirms this recommendation and asks for detailed options. As it’s getting late, he
asks for the details to be saved so everyone can consider them tomorrow. The
travel organizer emails them a summary of the different options available.
46
Persona

– a representation of a type of user (archetypical users)

– answer the question, “Who are we designing for?”, “What are we designing for?”

– behavior patterns, goals, skills, attitudes, and background information

47
48
Steps to creating persona
1. Collect the information about your users
2. Identify behavioral patterns from research data
3. Create personas and prioritize them
4. Find scenario(s) of interaction and create persona documentation
5. Share your findings and obtain acceptance from the team

49
Questions to ask during
persona development

50
Persona types - primary persona
– attracts most of the attention during the development process

– has attributes which are shared by a large part of the target group (main target)

– you develop for this persona and their goals

– 1 primary user persona per system

51
Persona types - secondary persona
– relevant for specific requirements which would not fit into the profile of Primary
Personas.

– other users who are mostly satisfied with the product, but have some additional
requirements.

– It represents requirements of customers which could not be incorporated into primary personas.

– No more than 2 secondary personas.

52
Use case
– written description of how users will perform tasks on your system

– beginning with a user's goal and ending when that goal is fulfilled.

What Use Cases Include:
– Who is using the website
– What the user wants to do
– The user’s goal
– The steps the user takes to accomplish a particular task
– How the website should respond to an action

What Use Cases Do NOT Include:
– Implementation-specific language
– Details about the user interfaces or screens

53
Elements of use case
– Actor
– Stakeholder
– Primary Actor
– Preconditions
– Triggers
– Main success scenarios
– Alternative paths

54
Use case for travel organizer
1. The system displays options for investigating visa and vaccination requirements.
2. The user chooses the option to find out about visa requirements.
3. The system prompts the user for the name of the destination country.
4. The user enters the country’s name.
5. The system checks that the country is valid.
6. The system prompts the user for her nationality.
7. The user enters her nationality.
8. The system checks the visa requirements of the entered country for a passport
holder of her nationality.
9. The system displays the visa requirements.
10. The system displays the option to print out the visa requirements.
11. The user chooses to print the requirements.

55
Alternative courses for travel organizer
Some alternative courses:
6. If the country name is invalid:
6.1 The system displays an error message.
6.2 The system returns to step 3.
8. If the nationality is invalid:
8.1 The system displays an error message.
8.2 The system returns to step 6.
9. If no information about visa requirements is found:
9.1 The system displays a suitable message.
9.2 The system returns to step 1.
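A minimal sketch of how the numbered steps and alternative courses above map onto control flow, using a tiny in-memory lookup table and scripted inputs (all names and data here are hypothetical):

# The visa-requirements use case as control flow (illustrative only).
VALID_COUNTRIES = {"norway"}
VALID_NATIONALITIES = {"indian", "swedish"}
VISA_DB = {("norway", "indian"): "Visa required",
           ("norway", "swedish"): "No visa required"}

def run(inputs):
    answers = iter(inputs)              # stands in for prompting the user
    while True:                         # 9.2: start over from step 1
        country = next(answers)         # 3-4: destination country
        while country not in VALID_COUNTRIES:          # 5, 6.1-6.2: re-ask on error
            print("error: unknown country")
            country = next(answers)
        nationality = next(answers)     # 6-7: nationality
        while nationality not in VALID_NATIONALITIES:  # 8.1-8.2: re-ask on error
            print("error: unknown nationality")
            nationality = next(answers)
        info = VISA_DB.get((country, nationality))     # 8: look up requirements
        if info is None:                               # 9.1-9.2: nothing found
            print("no visa information found")
            continue
        print(info)                                    # 9-10: display requirements
        return

run(["norwy", "norway", "indian"])   # one typo, then a successful lookup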

56
Example use case diagram for travel organizer

57
Empirical Methods (Cognitive aspects I)

Ch 8, 9, 10, 11

Morten Fjeld, t2i Lab, CSE, Chalmers 1/63


Historic Outlook (1/2)

Do you know where mobile towers are located?

Morten Fjeld, t2i Lab, CSE, Chalmers 2/63


Historic Outlook (1/2)

Comparative evaluation of UIs is not a novel idea


• 1503: benchmark cond. vs. TUI cond.

https://en.wikipedia.org/wiki/Abacus

Morten Fjeld, t2i Lab, CSE, Chalmers 3/63


Historic Outlook (2/2)

Comparative evaluation of UIs is not a novel idea

• 1980 vs. 1981: slide rule vs. HP 12c, a financial calculator using
Reverse Polish Notation (RPN)

Morten Fjeld, t2i Lab, CSE, Chalmers 4/63


Lecture Overview (1/2)

ISO Usability Definition, revisited

Evaluation: Goals of
and Types of Evaluation

Evaluation Procedure

……

Morten Fjeld, t2i Lab, CSE, Chalmers 5/63


Lecture Overview (2/2)

C. Fuglesang
Some empirical methods for
Mobile UIs
NASA TLX M. Carlsen

Eye Tracking
PAD (pleasure-arousal-dominance)
AttrakDiff (hedonic/pragmatic)
Example: UbiSwarm

Morten Fjeld, t2i Lab, CSE, Chalmers 6/63


ISO Definition of Usability

"[Usability refers to] the extent to which a


product can be used by specified users to
achieve specified goals with effectiveness,
efficiency and satisfaction in a specified
context of use." - ISO 9241-11

See also: http://en.wikipedia.org/wiki/Usability#ISO_standard

Morten Fjeld, t2i Lab, CSE, Chalmers 7/63


ISO Definition of Usability

however … necessary … is not sufficient …

Source: https://www.interaction-design.org/literature/topics/usability

Morten Fjeld, t2i Lab, CSE, Chalmers 8/63


Quality measure of TUIs

In the field of Human-Computer Interaction (HCI)


the usability of a program or tool is often
measured in terms of ISO standards:
efficiency, effectiveness, and user satisfaction.

ISO 9241-11. Ergonomic requirements for office work with visual display terminals (VDTs) –
Part 11: Guidance on usability (1998).

Morten Fjeld, t2i Lab, CSE, Chalmers 9/63


Goals of Evaluation

Aims to evaluate:

i) one new system vs. one existing system (benchmark)
OR
ii) new alternative a vs. new alternative b

1. Assessing system functionality


2. Assessing effect of interface on use
3. Identifying specific problems
Source: https://designbuzz.com/another-wmd-for-the-high-schooler-queen-bee-s-kitty/
Morten Fjeld, t2i Lab, CSE, Chalmers 10/63
Types of Evaluation

A few desktop methods (“no lab”)

Laboratory studies (“lab required”)


Think aloud
Collaborative studies
Interviews and questionnaires
Physiological methods

Field studies (“no desktop, no lab”)

Morten Fjeld, t2i Lab, CSE, Chalmers 11/63


A few desktop methods (no lab needed)

• Cognitive Walkthrough
• Usability Heuristic
• Review-based evaluation
• Others

Morten Fjeld, t2i Lab, CSE, Chalmers 12/63


Cognitive Walkthrough
Proposed by Polson et al.

Requires a detailed review of a sequence of actions. An action refers
to a step that the interface requires a user to perform in order to
accomplish a task.

The main focus is to establish how easy a system is to learn,


learning through exploration.

Usually performed by an expert in cognitive psychology

Expert ‘walks through’ design to identify potential problems using


psychological principles

Forms used to guide analysis


Example: http://www.cs4fn.org/usability/cogwalkthrough.php
Morten Fjeld, t2i Lab, CSE, Chalmers 13/63
Cognitive Walkthrough

For each action, the evaluator needs to answer 4


questions

1. Is the effect of the action the same as the user’s


goal at that point?

2. Will users see that the action is available?

3. Once users have found the correct action, will they
know it is the one they need?

4. After the action is taken, will users understand the


feedback they get?
Check out intro to cognitive walkthrough:
https://www.youtube.com/watch?v=CeWAbGU5cSw 3:43 minutes
Morten Fjeld, t2i Lab, CSE, Chalmers 14/63
Jakob Nielsen’s Ten Usability Heuristics

1. Visibility of system status


2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose,
and recover from errors
10. Help and documentation
Video intro, a bit longer than here: https://www.youtube.com/watch?v=_RxfU6dPZuU

Morten Fjeld, t2i Lab, CSE, Chalmers 15/63


Jakob Nielsen’s Ten Usability Heuristics

•To apply usability heuristics, read and use:


http://www.useit.com/papers/heuristic/heuristic_list.html
http://www.useit.com/papers/heuristic/heuristic_evaluation.html
•5-6 experts, using this list, may discover up to 80% of design errors.

Morten Fjeld, t2i Lab, CSE, Chalmers 16/63


Other “desktop” evaluation methods

Review-based:
Results from the literature used to support or
refute parts of design.
Care needed to ensure results are transferable
to new design.

Model-based evaluation
Cognitive models used to filter design options,
e.g. GOMS prediction of user performance (see the KLM sketch below).

Design rationale (a log-book)

Morten Fjeld, t2i Lab, CSE, Chalmers 17/63
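As a concrete illustration of model-based evaluation, a Keystroke-Level Model (KLM) sketch that sums standard operator times to predict expert task time (the operator values and the example operator string are typical textbook figures, used here as assumptions):

# KLM sketch: predicted execution time = sum of operator times (seconds).
# K keystroke, P point with mouse, H home hands on device, M mental act, B mouse button.
KLM = {"K": 0.20, "P": 1.10, "H": 0.40, "M": 1.35, "B": 0.10}

def klm_time(operators):
    return sum(KLM[op] for op in operators)

# e.g. delete a file: home to mouse, think, point at icon, click, think, press Delete
print(klm_time("HMPBMK"))   # -> 4.5 seconds with these operator values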


Laboratory studies

Advantages:
specialist equipment available
uninterrupted environment
Repeated measurement
Disadvantages:
lack of context
difficult to observe several users cooperating
Appropriate
if system location is dangerous or impractical

Morten Fjeld, t2i Lab, CSE, Chalmers 18/63


Experimental evaluation

Requires an artefact:
simulation, prototype, full implementation

Morten Fjeld, t2i Lab, CSE, Chalmers 19/63


Laboratory studies, test person

IBM Usability Lab, test subject behind glass, experts gathered


around screens. From: http://www.davidpereira.com/blog/?p=10

Morten Fjeld, t2i Lab, CSE, Chalmers 20/63


Laboratory studies, experimental leader

View from test leader

Morten Fjeld, t2i Lab, CSE, Chalmers 21/63


Example: Switching Mobile Phone

• Usability lab: facilitator (1), participant (2), camera (3), microphone (4),
mobile phone (5), one-way mirror (6), technicians 1–3 (7–9)

This paper was done as a project:
https://scholar.google.com/scholar?hl=sv&as_sdt=0%2C5&q=Exploring+potential+usability+gaps+when+switching+mobile+phones%3A+an+empirical+study&btnG=
Fallas Yamashita, A., Barendregt, W., Fjeld, M. (2007). Exploring potential usability gaps when switching mobile phones:
An empirical study. The 21st BCS HCI Group conference, pp. 109-116.
https://pdfs.semanticscholar.org/2333/8340dceabed330ca87b924b106bbe179a764.pdf

Morten Fjeld, t2i Lab, CSE, Chalmers 22/63


Experimental evaluation

controlled evaluation of specific aspects of


interactive behaviour
evaluator chooses hypotheses to be tested
a number of experimental conditions are
considered which differ only in the value of some
controlled variable.
changes in behavioural measure are attributed
to different conditions

Morten Fjeld, t2i Lab, CSE, Chalmers 23/63


Experimental factors

Subjects
who – representative, sufficient sample
Variables
things to modify and measure
Hypothesis
what you’d like to show
Experimental design
how you are going to do it

Morten Fjeld, t2i Lab, CSE, Chalmers 24/63


Variables

independent variable (IV)


characteristic changed to produce different
conditions
e.g. interface style, number of menu items

dependent variable (DV)


characteristics measured in the experiment
e.g. time taken, number of errors.

Morten Fjeld, t2i Lab, CSE, Chalmers 25/63


Hypothesis

prediction of outcome
framed in terms of IV and DV

e.g. “error rate will increase as font size


decreases”

null hypothesis:
states no difference between conditions
aim is to disprove this

e.g. null hyp. = “no change in error rate with font


size”

Morten Fjeld, t2i Lab, CSE, Chalmers 26/63


Experimental design

within-groups design
This type of experimental design is when
one set of participants are tested more
than once and their scores are compared.
Textbook, pp. 49-52 and 76

between-groups design
An experiment that has two or more groups of
subjects each being tested by a different testing
factor simultaneously
Textbook, pp. 49-52

Choosing between the two, pp. 54-55


Source: https://www.elsevier.com/books/research-methods-in-human-computer-interaction/lazar/978-0-12-805390-4

Morten Fjeld, t2i Lab, CSE, Chalmers 27/63


Experimental design; pros and cons

within groups design


each subject performs experiment under each
condition
transfer of learning possible
less costly and less likely to suffer from user
variation.
between groups design
each subject performs under only one condition
no transfer of learning
more users required
variation can bias results.

Morten Fjeld, t2i Lab, CSE, Chalmers 28/63
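A small sketch of how condition order can be counterbalanced in a within-groups design, rotating the order per participant (a simple balanced rotation; a full Latin square or random orders are equally common):

# Counterbalance condition order across participants (within-groups design).
conditions = ["A", "B", "C"]

def order_for(participant):
    k = participant % len(conditions)
    return conditions[k:] + conditions[:k]   # rotate the order

for p in range(6):
    print(f"P{p + 1}: {order_for(p)}")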


Analysis of data

Before you start to do any statistics


(using SYSTAT, SPSS, R, Excel etc.):
look at data
save original data

Choice of statistical technique depends on


type of data
information required

Type of data
discrete - finite number of values
continuous - any value

Morten Fjeld, t2i Lab, CSE, Chalmers 29/63


Analysis - types of test

parametric
assume normal distribution
robust
powerful

non-parametric
do not assume normal distribution
less powerful
more reliable

contingency table
classify data by discrete attributes
count number of data items in each group
Morten Fjeld, t2i Lab, CSE, Chalmers 30/63
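To make the distinction concrete, here is a sketch comparing task times under two conditions with a parametric and a non-parametric test (the numbers are made up; SciPy is assumed to be available):

# Parametric vs. non-parametric comparison of two conditions (made-up data).
from scipy import stats

condition_a = [12.1, 10.4, 11.8, 13.0, 9.9, 12.5]   # e.g. new interface, seconds
condition_b = [14.2, 13.5, 15.1, 12.9, 14.8, 16.0]  # e.g. existing interface

t, p_t = stats.ttest_ind(condition_a, condition_b)     # assumes normal distribution
u, p_u = stats.mannwhitneyu(condition_a, condition_b)  # no normality assumption

print(f"t-test:       t = {t:.2f}, p = {p_t:.3f}")
print(f"Mann-Whitney: U = {u:.1f}, p = {p_u:.3f}")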
Analysis of data (cont.)

What information is required?


is there a difference?
how big is the difference?
how accurate is the estimate?

Parametric and non-parametric tests mainly


address first of these

Morten Fjeld, t2i Lab, CSE, Chalmers 31/63


Thinking Aloud

user observed performing task


user asked to describe what he is doing and
why, what he thinks is happening etc.

Advantages
simplicity - requires little expertise
can provide useful insight
can show how system is actually used
Disadvantages
subjective
selective
act of describing may alter task performance

Morten Fjeld, t2i Lab, CSE, Chalmers 32/63


Cooperative evaluation (groups)

More difficult than single-user experiments

Problems with:
subject groups
choice of task
data gathering
analysis

Morten Fjeld, t2i Lab, CSE, Chalmers 33/63


Cooperative evaluation (groups)

variation on think aloud


user collaborates in evaluation
both user and evaluator can ask each other
questions throughout

Additional advantages
less constrained and easier to use
user is encouraged to criticize system
clarification possible

Morten Fjeld, t2i Lab, CSE, Chalmers 34/63


Cooperative evaluation (groups)

Source: https://dl.acm.org/citation.cfm?doid=2671015.2671016
Morten Fjeld, t2i Lab, CSE, Chalmers 35/63
Interviews
analyst questions user on a one-to-one basis
usually based on prepared questions
informal, subjective and relatively cheap

Advantages
can be varied to suit context
issues can be explored more fully
can elicit user views and identify unanticipated
problems
Disadvantages
very subjective
time consuming

Morten Fjeld, t2i Lab, CSE, Chalmers 36/63


Questionnaires

Set of fixed questions given to users

Advantages
quick and reaches large user group
can be analyzed more rigorously
Disadvantages
less flexible
less probing

Morten Fjeld, t2i Lab, CSE, Chalmers 37/63


Questionnaires (ctd)

Need careful design


what information is required?
how are answers to be analyzed?

Styles of question
general
open-ended
scalar
multi-choice
ranked

Morten Fjeld, t2i Lab, CSE, Chalmers 38/63


Leaving your office: Field Studies

Brief overview of field studies: https://www.nngroup.com/consulting-field-studies/

Morten Fjeld, t2i Lab, CSE, Chalmers 39/63


Field Studies

Advantages:
natural environment
context retained (though observation may alter it)
longitudinal studies possible
Disadvantages:
distractions
noise
Appropriate
where context is crucial for longitudinal studies

Morten Fjeld, t2i Lab, CSE, Chalmers 40/63


Staying Indoors:
Selection of Method for TUIs
• NASA-TLX
• Eye tracking
• Physiological measurement
• Pleasure-Arousal-Dominance (PAD)
• Hedonic quality; AttrakDiff
• Example 1: UbiSwarm: PAD and AttrakDiff

Morten Fjeld, t2i Lab, CSE, Chalmers 41/63


NASA-TLX

• Mental Demand
• Physical Demand
• Temporal Demand
• Performance
• Frustration
• Effort
• But, are all equally important?

Morten Fjeld, t2i Lab, CSE, Chalmers 42/63


NASA-TLX

Morten Fjeld, t2i Lab, CSE, Chalmers 43/63


NASA-TLX

C. Fuglesang
Prof. KTH

C. Fuglesang
Prof. KTH

https://sv.wikipedia.org/wiki/Christer_Fuglesang
Morten Fjeld, t2i Lab, CSE, Chalmers 44/63
NASA-TLX

• Analysis of data collected, choose:


“But, are all equally important?”
if NO then
individual weighting (=> “cards”)
else
Raw TLX
end if;
• Administer NASA-TLX; choose:
Pen and paper OR computer
Morten Fjeld, t2i Lab, CSE, Chalmers 45/63
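A small sketch of the two scoring options named above, Raw TLX vs. individually weighted TLX, with hypothetical ratings and pairwise-comparison weights (the 15 card comparisons yield weights that sum to 15):

# Raw vs. weighted NASA-TLX for one participant (hypothetical numbers).
ratings = {"Mental": 70, "Physical": 20, "Temporal": 55,
           "Performance": 40, "Effort": 60, "Frustration": 35}   # 0-100 scale
weights = {"Mental": 5, "Physical": 0, "Temporal": 3,
           "Performance": 2, "Effort": 4, "Frustration": 1}      # sum to 15

raw_tlx = sum(ratings.values()) / len(ratings)                     # simple average
weighted_tlx = sum(ratings[s] * weights[s] for s in ratings) / 15  # weighted average

print(f"Raw TLX:      {raw_tlx:.1f}")      # 46.7
print(f"Weighted TLX: {weighted_tlx:.1f}") # 58.0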
Eye tracking: Tobii (Sthlm)

https://www.tobii.com

Morten Fjeld, t2i Lab, CSE, Chalmers 46/63


Eye tracking: Tobii (Sthlm)

https://www.tobii.com
https://en.wikipedia.org/wiki/Tobii_Technology
Morten Fjeld, t2i Lab, CSE, Chalmers 47/63
Eye tracking: SmartEye (Gbg)

• ”Local tech”:
http://www.smarteye.se/

Morten Fjeld, t2i Lab, CSE, Chalmers 48/63


Eye tracking: Noldus (NL)

Rapidly gather rich and meaningful data.


Record time automatically and accurately.
Integrate video and physiology in behavioral studies.
Calculate statistics, assess reliability, and create transition
matrices.
Make clips of those parts of video and data of interest to you.
Analyze groups of observations at once.

http://www.noldus.com/

Morten Fjeld, t2i Lab, CSE, Chalmers 49/63


Eye tracking: SMI (disappeared 2017 …)

SMI acquired by Apple: https://techcrunch.com/2017/06/26/apple-acquires-smi-eye-tracking-company/

Example of use: https://www.smivision.com

Morten Fjeld, t2i Lab, CSE, Chalmers 50/63


Eye tracking: Pupil Lab

Morten Fjeld, t2i Lab, CSE, Chalmers 51/63


Physiological measurements

Emotional response linked to physical changes


These may help determine a user’s reaction to a UI

Measurements include:
heart activity, including blood pressure, volume & pulse
activity of sweat glands: Galvanic Skin Response (GSR)
electrical activity in muscle: electromyogram (EMG)
electrical activity in brain: electroencephalogram (EEG)

Morten Fjeld, t2i Lab, CSE, Chalmers 52/63


Quantifying Human Emotions in TUI

Pleasure-Arousal-Dominance (PAD)

Morten Fjeld, t2i Lab, CSE, Chalmers 53/63


Quantifying Human Emotions in Mobile UIs

ALBERT MEHRABIAN, UCLA, 1996:


Evidence bearing on the Pleasure-Arousal-Dominance (PAD)
Emotional State Model was reviewed and showed that its
three nearly orthogonal dimensions provided a sufficiently
comprehensive description of emotional states.

Source: http://www.springerlink.com/content/g071r4n59240u537/fulltext.pdf

Morten Fjeld, t2i Lab, CSE, Chalmers 54/63


Hedonic Quality and Pragmatic Quality

Marc Hassenzahl. 2004. The Interplay of Beauty, Goodness, and Usability in Interactive
Products. Human Computer Interaction 19, 4 (Dec. 2004), 319–349.
https://doi.org/10.1207/s15327051hci1904_2

Morten Fjeld, t2i Lab, CSE, Chalmers 55/63


AttrakDiff: A method for evaluating the
usability and design of an interactive product

hedonic quality
• the aspects of a user
interface that appeal to a
person's desire of
pleasure and avoidance of
boredom and discomfort.
• the aspects that are fun,
original, interesting,
engaging, and cool
• a positive subjective
experience

Tool: http://attrakdiff.de/index-en.html#nutzen

Morten Fjeld, t2i Lab, CSE, Chalmers 56/63


AttrakDiff: A method for evaluating the
usability and design of an interactive product

•Digital musical instruments

Source: http://www.mdpi.com/2075-1702/3/4/317/htm

Morten Fjeld, t2i Lab, CSE, Chalmers 57/63


Actuated Tangible User Interfaces
Video:
https://www.youtube.com/watch?v=oT7theBRBzI&t=1s

GaussBricks, 2014 (link below)


UbiSwarm, 2017:
https://dl.acm.org/citation.cfm?doid=3139486.3130931
Pico, 2005,
http://tangible.media.mit.edu/project/pico/

Gauss, 2012-2016: http://www.cmlab.csie.ntu.edu.tw/~howieliang/HCIProjects/projectGauss.htm


Morten Fjeld, t2i Lab, CSE, Chalmers 58/63
UbiSwarm: PAD and TUIs come together

Paper: https://dl.acm.org/citation.cfm?doid=3139486.3130931
Video: https://www.youtube.com/watch?v=oT7theBRBzI&feature=youtu.be

Morten Fjeld, t2i Lab, CSE, Chalmers 59/63


UbiSwarm: Ubiquitous Robotic Interfaces and
Investigation of Abstract Motion as a Display

Paper: https://dl.acm.org/citation.cfm?doid=3139486.3130931
Video: https://www.youtube.com/watch?v=oT7theBRBzI&feature=youtu.be

Morten Fjeld, t2i Lab, CSE, Chalmers 60/63


UbiSwarm: Ubiquitous Robotic Interfaces and
Investigation of Abstract Motion as a Display

Paper: https://dl.acm.org/citation.cfm?doid=3139486.3130931
Video: https://www.youtube.com/watch?v=oT7theBRBzI&feature=youtu.be

Morten Fjeld, t2i Lab, CSE, Chalmers 61/63


UbiSwarm: Ubiquitous Robotic Interfaces and
Investigation of Abstract Motion as a Display

Paper: https://dl.acm.org/citation.cfm?doid=3139486.3130931
Video: https://www.youtube.com/watch?v=oT7theBRBzI&feature=youtu.be

Morten Fjeld, t2i Lab, CSE, Chalmers 62/63


UbiSwarm: Ubiquitous Robotic Interfaces and
Investigation of Abstract Motion as a Display

Paper: https://dl.acm.org/citation.cfm?doid=3139486.3130931
Video: https://www.youtube.com/watch?v=oT7theBRBzI&feature=youtu.be

Morten Fjeld, t2i Lab, CSE, Chalmers 63/63


Empirical Methods with Application
for Mobile User Interfaces, final advice:
Consult textbooks in this area, try out,
familiarize, there is no perfect method!

introductory research-oriented advanced/good! mobile-oriented (…)

Morten Fjeld, t2i Lab, CSE, Chalmers 64/63


Design & Prototyping
Lecture Notes

1
Outline
– What is a prototype?
– Why prototype?
– Fidelities of prototyping
– Low fidelity
– High fidelity
– Prototyping Tools
– Page scanning patterns; eye tracking
– Design trends
– Physical prototyping
– Prototyping wearable devices (research example A)

– Prototyping bike display technology (research example B)


2
Prototyping

3
Four basic activities in interaction design

4
5

6
What is a prototype?
A prototype is an early sample or mock-up of the product you wish to build.

It’s a quick model explaining the actual plans for the final product.

7
What is a prototype?
…more specifically: In interaction design it can be
– a series of screen sketches
– a storyboard
– a powerpoint slide show
– a video
– a cardboard mock-up
– a piece of software with limited functionality

8
What is a prototype?
Valuable for requirements elicitation because users can experiment with the
system and point out its strengths and weaknesses.

9
Why prototype?
– You can test out ideas for yourself

– It encourages reflection

– Stakeholders can see, hold, interact


with a prototype more easily than a
document or a drawing

If you give them something that they get to use, they know what they don’t want (by Steve Jobs)
10
Why prototype?
– To better understand how users will interact with your final artifact.

– The prototype can reveal errors and omissions in the requirements.

– Users gain a sense of ownership of the final product.

11
Do
– When creating interactive high-fidelity prototypes and simulations,
build in realistic delays

12
Don’t
– Don’t prototype features or functionality that cannot be implemented.

– Don’t begin prototype review sessions without clear guidelines for feedback.

13
Don’t
– Don’t be a perfectionist.

– Don’t prototype everything. Most of the time, you shouldn’t have to.

14
Fidelities of prototyping
– Fidelity of a prototype denotes the prototype’s resemblance to a finished system

15
Fidelities of Prototyping

16
Low-fidelity prototyping
– It’s often paper-based and does not allow user
interaction.

– It’s helpful in enabling early visualization of
alternative design solutions.

– It’s perfect at the early stages and is
refined throughout the process.

17
Low-fidelity prototyping
– Storyboards
– Sketching
– Card-based (paper) prototypes
– Whiteboards
– Flip charts
– Post-it notes
– Index cards
– Wizard-of-Oz
– Design tool

https://www.simpleusability.com/inspiration/2018/08/wizard-of-oz-testing-a-method-of-testing-a-system-that-does-not-yet-exist/
18
Storyboards
– Often used with scenarios, bringing
more detail, and a chance to role play

– It is a series of sketches showing how a


user might progress through a task using
the device

– Used early in design

Bill Buxton on sketching experiences,


Institute of Design Strategy
Conference, May 2008
https://vimeo.com/5189134
19
or: https://www.youtube.com/watch?v=xx1WveKV7aE.
Storyboards
1. Start with plain text and arrows.

2. Add emotion to your story.

20
Storyboards
3. Translate each step into a frame

4. Design a clear outcome.

21
Wireframing

22
Paper prototyping
– It allows you to prototype a digital product interface without using digital
software.

(1) To communicate ideas


(2) As a usability testing technique

– The technique is based on creating hand drawings


of different screens that represent user interfaces
of a product.

23
‘Wizard-of-Oz’ prototyping
– The user thinks they are interacting with a computer, but a developer is
responding to output rather than the system.

– Usually done early in design to


understand users’ expectations

– What is ‘wrong’ with this approach?

User

>Blurb blurb
>Do this
>Why?

24
High-fidelity prototyping
– It is computer-based, and usually allows realistic user interactions.

– It takes you as close as possible to a true representation of the user interface.

– It is much more effective in


collecting true human performance data,
demonstrating actual products to clients, management, and others.

25
High-fidelity prototyping
Visual design: Realistic and detailed design.

Content: Designers use real or similar-to-real content.

Interactivity: Prototypes are highly realistic in their interactions.

26
High-fidelity prototyping
– It represents the core functionality of the product's user
interface.

– It is an interactive system.

– Users can
enter data in entry fields,
respond to messages,
select icons to open windows
interact with the user interface

27
28
Low-fidelity prototyping – pros vs. cons
Advantages
– Lower development cost
– Evaluates multiple design concepts
– Useful communication device
– Addresses screen layout issues
– Proof of concept

Disadvantages
– Limited error checking
– Poor detailed specification to code to
– Facilitator driven
– Limited utility after requirements established
– Limited usefulness for usability tests
– Navigational and flow limitations
29
High-fidelity prototyping – pros vs. cons
Advantages
– Complete functionality
– Fully interactive
– User driven
– Use for exploration and test
– Look and feel of final product
– Serves as a living specification

Disadvantages
– More resource-intensive to develop
– Time-consuming to create
– Inefficient for proof-of-concept design
– Not effective for requirement gathering

30
Sketch => Wireframe => Lo-fi Prototype => Hi-fi Mockup => Hi-fi Prototype (Rapid) => Code

31
Tools & Techniques

32
http://prototypingtools.co/

33
Balsamiq mockup

34
https://balsamiq.com
Axure

35
http://www.axure.com/
Design Better Pages and Layouts

36
Page scanning patterns; eye tracking

37
Page scanning patterns; eye tracking in VR

https://www.tobiipro.com/fields-of-use/immersive-vr-research/

38
Page scanning patterns - F-Pattern
ACM Eye Tracking conference:
https://etra.acm.org/2020/

39
Page scanning patterns - F-Pattern

https://mymodernmet.com/regional-musical-preferences-heat-map/

40
Scanning patterns

https://www.youtube.com/watch?v=SQxrsUXqKCM.

41
Page scanning patterns - Z-Pattern

42
Design Trends

43
Flat UI

44
Material UI

45
Flat vs. material design

46
Physical prototyping

47
Physical prototyping

• Get feedback on our design faster; saves money


• Experiment with alternative designs
• Fix problems before code is written
• Keep the design centered on the user

Source: Adapted from James Landay


Physical prototyping

Fidelity refers to the level of detail:

• Low-fidelity:
• artists' renditions with many details missing
• High-fidelity:
• prototypes look like the final product

MakerBot Industries: 3D printers


Jim Rudd, Ken Stern, Scott Isensee (1996): Low vs. high-fidelity prototyping debate. interactions 3-1, pp. 76-85.
Physical prototyping

Jonasson, P. A., Fjeld, M., Fallas Yamashita, A. (2007): Expert Habits vs. UI Improvements: Re-Design of a Room
Booking System. The 21st BCS HCI Group conference, pp. 51-54.
http://www.t2i.se/pub/papers/paper_300.pdf

Source: Joep Frens, ETH Zurich and TU Eindhoven, http://www.richinteraction.nl/


Physical prototyping

Testing SW and HW for Mobile UI Design


Hyungjun Park, Hee-Cheol Moon, and Jae Yeol Lee (2009): Tangible augmented prototyping of digital
handheld products. Comput. Ind. 60, 2 (February 2009), pp. 114-125.
http://dl.acm.org/citation.cfm?id=1501228
Physical prototyping
FIDELITY : Low – High

FORM : 2D Paper – 3D Physical – Digital

COMBINING FIDELITY AND FORM:


Prototypes will have a fidelity (low to high);
at the same time they can be built using 2D paper, 3D physical
material, digital material, or combinations of these.
Prototyping Wearable Devices
Rocsole Inc. Flow Watch

53 05.10.21
Prototyping Wearable Devices
Rocsole Inc. Flow Watch

54 05.10.21
Prototyping Wearable Devices

05.10.21 55/12
Evaluation: From Framework to Measurables

Pointing time and Fitts’ Law

Offering UI at Horizontal or
Vertical Surface?

Touch Sensors and


Understanding Touch

Position and Rate Control; mDOF

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 1/76


Evaluation: From Framework to Measurables

Reach with TUIs at Tables


Quantifying Human Emotions in TUI

TUI examples (A, B, C) and 2 industrial products

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 2/76


Estimating the time of pointing

Why is it important?

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 3/76


Estimating the time of pointing

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 4/76


Fitts’ Law (2/3)

D is the distance from the starting point to the


center of the target.
W is the width of the target measured along the
axis of motion. W can also be thought of as the
allowed error tolerance in the final position, since
the final point of the motion must fall within
W/2 of the target's center.
….

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 5/76


Fitts’ Law (1/3)

….
MT is the average time to complete the
movement.
a and b are constants that depend on the choice
of input device and are usually determined
empirically by regression analysis.
ID is the index of difficulty

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 6/76
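The formula these definitions refer to is shown only as an image on the slides; in the commonly used Shannon formulation it reads

$MT = a + b \cdot ID$, with $ID = \log_2\left(\frac{D}{W} + 1\right)$

(Fitts' original formulation uses $ID = \log_2(2D/W)$; both capture that farther and smaller targets take longer to acquire.)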


Fitts’ Law (3/3): Familiarize with it!

Fitts’ Law demonstrator


Fitts' Law predicts the time it takes a user to point at an object
using a specific pointing device (such as a mouse, trackball,
trackpad, or even a finger). It helps us in designing user interfaces
(deciding the location and size of buttons and other elements) and
choosing the right pointing device for the task.
Work in pairs, explore this demonstrator:
http://fww.few.vu.nl/hci/interactive/fitts/

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 7/76
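As a companion to the demonstrator exercise, a minimal sketch of the prediction in Python; the constants a and b below are made-up placeholder values, since in practice they come from a regression on measured pointing data:

import math

def fitts_mt(D, W, a=0.1, b=0.15):
    """Predicted movement time (s) for a target at distance D with width W.
    a, b are device-dependent constants (placeholder values here)."""
    ID = math.log2(D / W + 1)     # index of difficulty, in bits
    return a + b * ID

# Example: a distant, small button vs. a nearby, large one
print(fitts_mt(D=400, W=20))      # higher ID -> longer predicted time
print(fitts_mt(D=100, W=80))      # lower ID -> shorter predicted time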


Recent Application of Fitts’ Law

Fitts' Throughput and the Remarkable


Case of Touch-based Target Selection

Indirect input (left) requires a cursor as an intermediary. With


direct input (right), actions occur directly on output targets
Paper: http://www.yorku.ca/mack/hcii2015a.html
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 8/76
Recent Application of Fitts’ Law

Task categories for study

Fitts' law tasks: 1D vs. 2D. Serial vs. discrete


Paper: http://www.yorku.ca/mack/hcii2015a.html
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 9/76
Recent Application of Fitts’ Law

Example of Fitts' Throughput (bits per second,


bps) for Touch-based Target Selection

Fitts' law tasks: 1D vs. 2D. Serial vs. discrete


Paper: http://www.yorku.ca/mack/hcii2015a.html
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 10/76
Other Similar Laws: Steering Law

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 11/76
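The slide shows the law only as an image; for reference, the steering law (Accot and Zhai) generalizes Fitts' law from pointing to steering along a path C through a tunnel of width W(s):

$T = a + b \int_C \frac{ds}{W(s)}$

For a straight tunnel of constant width W and length A this reduces to $T = a + b \cdot A/W$.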


Offering UI at Horizontal or Vertical Surface?

(”gorilla-arm effect”)

Images from:
Direct-Touch vs. Mouse Input for Tabletop Displays

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 12/76


Offering UI at Horizontal or Vertical Surface?

(”gorilla-arm effect”)

Paper: https://phys.org/news/2017-05-gorilla-arm-fatigue-mid-air-usage.html

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 13/76


Offering UI at Horizontal or Vertical Surface?

(”gorilla-arm effect”)

Mini exercise:
Watch the video, understand
Consumed Endurance (CE)

Paper: http://hci.cs.umanitoba.ca/projects-and-research/details/ce

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 14/76


Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 15/76
Offering UI at Horizontal or Vertical Surface?

This can also be used to estimate where to


place tangible devices on a vertical surface

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 16/76


Touch Sensors and Understanding Touch

Estimating the minimal size of a graphical item

Christian Holz (2011): Understanding Touch (UIST 2011)

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 17/76


Touch Sensors and Understanding Touch

Estimating the minimal size of a graphical item

https://en.wikipedia.org/wiki/Sifteo_Cubes

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 18/76


Touch Sensors and Understanding Touch

Mini exercise:
How smart is touch sensing on your mobile device?

Christian Holz (2011): Understanding Touch (UIST 2011)

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 19/76


Touch Sensors and Understanding Touch

Estimating the minimal size of a graphical unit


… and of a tangible unit… when graphical and
tangible units work closely within a small space

Project Gauss: https://howieliang.github.io/Projects/GaussBricks.html

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 20/76


Position and Rate Control; mDOF

Position and rate control


Discuss: Find out more about this!

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 21/76


Position and Rate Control; mDOF

Position, rate, and acceleration control


Input Transformation Output

Position
Control

Rate
Control

Acceler.
Control

From paper: http://etclab.mie.utoronto.ca/people/shumin_dir/papers/PhD_Thesis/Chapter2/Chapter23.html

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 22/76
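A minimal sketch (my own illustration, not from the slides) of how the three control orders map a normalized input u in [-1, 1] to an output, given a time step dt:

def position_control(u, gain=1.0):
    # zero-order: the input position directly sets the output position
    return gain * u

def rate_control(x, u, dt, gain=1.0):
    # first-order: the input sets the output velocity
    return x + gain * u * dt

def acceleration_control(x, v, u, dt, gain=1.0):
    # second-order: the input sets the output acceleration
    v = v + gain * u * dt
    return x + v * dt, v

This is also why free-moving (isotonic) devices tend to pair well with position control and force-sensing (isometric or elastic) devices with rate control, the point of the device stiffness and control order work cited on the following slides.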


Studying 6DOF input devices; mDOF

From paper: http://t2i.se/wp-content/uploads/2014/11/IJHCI_25_7.pdf


and http://t2i.se/wp-content/uploads/2012/11/nchi08_pid05_id1.pdf

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 23/76


Studying 6DOF input devices; mDOF

Video: https://www.youtube.com/watch?v=k9PCdodYSAA
From paper: http://t2i.se/wp-content/uploads/2014/11/IJHCI_25_7.pdf
and http://t2i.se/wp-content/uploads/2012/11/nchi08_pid05_id1.pdf

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 24/76


BSc-project,
Chalmers,
2005

Video: https://www.youtube.com/watch?v=k9PCdodYSAA
From paper: http://t2i.se/wp-content/uploads/2014/11/IJHCI_25_7.pdf
and http://t2i.se/wp-content/uploads/2012/11/nchi08_pid05_id1.pdf

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 25/76


Matching device stiffness and control
order for positioning tasks
PhD-thesis, 2000

From paper: http://t2i.se/wp-content/uploads/2014/11/IJHCI_25_7.pdf

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 26/76


Matching device stiffness and control
order for positioning tasks

From paper: http://t2i.se/wp-content/uploads/2014/11/IJHCI_25_7.pdf

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 27/76


Reach with TUIs at Tabletops

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 28/76


Reach with TUIs at Tabletops

Models of reach

From paper: http://ieeexplore.ieee.org/abstract/document/1579192/

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 29/76


Working at Interactive Tabletops
Part of my PhD
ETH Zurich,
1997

Fjeld M., Bichsel M., Rauterberg M. (1998) BUILD-IT: An intuitive design tool based on direct object manipulation. In: Wachsmuth I., Fröhlich
M. (eds) Gesture and Sign Language in Human-Computer Interaction. GW 1997. Lecture Notes in Computer Science, vol 1371. Springer,
Berlin, Heidelberg. https://doi.org/10.1007/BFb0053008
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 30/76
Improving Reach at Interactive Tabletops
Part of a MSc-thesis,
Tohoku (JP), 2018

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 31/76


Reach with TUIs at Tabletops

Models of reach

From paper: http://ieeexplore.ieee.org/abstract/document/1579192/

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 32/76


Reach with TUIs at Tabletops

Models of reach

From paper: http://ieeexplore.ieee.org/abstract/document/1579192/

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 33/76


Quantifying Human Emotions in TUI

How to design TUIs that are emotionally


pleasing

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 34/76


Quantifying Human Emotions in TUI

Pleasure-Arousal-Dominance (PAD)

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 35/76


Quantifying Human Emotions in TUI

Pleasure-Arousal-Dominance (PAD)

The Pleasure-Displeasure Scale measures how


pleasant or unpleasant one feels about
something. For instance both anger and fear are
unpleasant emotions, and both score on the
displeasure side. However joy is a pleasant
emotion.[1]

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 36/76


Quantifying Human Emotions in TUI

Pleasure-Arousal-Dominance (PAD)
The Arousal-Nonarousal Scale measures how
energized or soporific one feels. It is not the
intensity of the emotion -- for grief and
depression can be low arousal intense
feelings. While both anger and rage are
unpleasant emotions, rage has a higher
intensity or a higher arousal state. However
boredom, which is also an unpleasant state,
has a low arousal value.[1]

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 37/76


Quantifying Human Emotions in TUI

Pleasure-Arousal-Dominance (PAD)

The Dominance-Submissiveness Scale


represents the controlling and dominant versus
controlled or submissive one feels. For instance
while both fear and anger are unpleasant
emotions, anger is a dominant emotion, while
fear is a submissive emotion.[1]

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 38/76


Example A: Constructive Assemblies

Educational and medical applications of tangible UIs in the project called Cognitive Cubes

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 39/76


Constructive Assemblies (Active Cube)

Active Cube shows example uses of a series of sensors and components, covering almost all of these:

• Arduino board
• IR distance sensor (GP2D12 or GP2D120X or GP2Y0A21)
• LED/ LDR
• Fade with PWM
• Piezo speaker (knock detection)
• Switch / push button
• RF communication
• Play notes w. speakers
• Processing language / PureData
• Max/MSP

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 40/76


Active Cube

Active cube is also an example of a form which:

• Can be reproduced into multiple instances

• Can be instantiated for alternative functionality/uses

• Can offer meaningful two-handed coordinated interaction

• Is used to build a game or problem-solving application

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 41/76


Active Cube

Video and paper presentations:


http://www-human.ist.osaka-u.ac.jp/ActiveCube/
Paper presentations of medical uses:http://www-
human.ist.osaka-u.ac.jp/~ehud/Research.html

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 42/76


Background: Related Research Areas

Tangible AR Tangible Tabletops Ambient UIs Embodied UIs

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 43/76


Example B:

Tangible AR: ”Chemical Cubes”:

Project web: http://www.t2i.se/projects.php?project=ac


Video material: https://www.youtube.com/channel/UC-KqoJ77ZfWhGPN4IlLJbmQ

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 44/76


Tangible AR: Overview

Research Question
Experimental Design
Results, mainly subjective results
Re-design
Video of re-designed System
Future Work
Summary

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 45/76


Research Question

How does TUI compare to ball-and-stick?

AC BSM

We chose to examine learning


effectiveness and user preferences
CHI 2007 paper: https://dl.acm.org/citation.cfm?doid=1240624.1240745
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 46/76
Experimental design
5 Female, 21 Male
secondary school students

Mean age 17.4

Repeated measure (AB-BA)

Tasks were related to


smaller organic molecules
and functional groups

Problem solving, test,


task load, preference, and
satisfaction were measured

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 47/76


Results: NASA-TLX

Mental demands
and frustration
were higher
for AC than BSM

Physical demands
were lower
for AC than BSM

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 48/76


Results: Tool preference

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 49/76


Results: SUMI scale value, AC only

Learnability
was highest

Efficiency
was lowest

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 50/76


Re-design addressed 5 objectives

We aimed to improve the following criteria


Comfort of use
User support MSc-thesis, 2004
Ease of use
Ease of learning the system

Hence, we carried out re-design as follows


GUI development (easy-to-use functionality)
Import from external DB (not always built from scratch)
Improved visualization
GUI-TUI integration
(Extended portability)

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 51/76


Re-design: GUI Development

Offering existing and new functionality

Model import from


internal DB (left) and PubChem (right)

Project material: http://t2i.web.pin.se/projects.php?project=ac


Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 52/76
Re-design: Improved Visualization

Shadow rendering Display configuration

Project material: http://t2i.web.pin.se/projects.php?project=ac


Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 53/76
Re-design: GUI TUI Integration

OpenGL was used to program the GUI


Alpha blending enables GUI TUI focus switch

Project material: http://t2i.web.pin.se/projects.php?project=ac


Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 54/76
BSc-thesis, 2005

Almgren, Joakim, et al. "Tangible User Interface for Chemistry Education: Visualization; Portability; and
Database." SIGRAD 2005 The Annual SIGRAD Conference Special Theme-Mobile Graphics. No. 016.
Linköping University Electronic Press, 2005.

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 55/76


Future Work: Tracked Ball-and-Stick

Ultrasound tracking, radio AM triggered

Project material: http://t2i.web.pin.se/projects.php?project=ac


Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 56/76
Background: Related Research Areas

Tangible AR Tangible Tabletops Ambient UIs Embodied UIs

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 57/76


Example C: Tangible Tabletop for
Emergency Management

http://www.theivac.org/content/pie-stakeholder-advisory-group

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 58/76


Example C: Tangible Tabletop for
Emergency Management

Map interaction screenshot:


§ Shape control is done by moving the nodes (top)
§ Time control employs time-line (bottom)
A. Alavi, B. Clocher, A. Smith, A. Kunz, M. Fjeld (submitted): Multi-State Device Tracking for Tangible
Tabletops. Submitted to SIGRAD 2011, 4 pages.

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 59/76


Example C: Tangible Tabletop for
Emergency Management

Prototypical devices on existing tabletop system:


§ Knobs enabling bimanual browsing
(for instance days and hours) of framed content (L)
§ Adjustable ruler offering to capture map distances (R)
A. Alavi, B. Clocher, A. Smith, A. Kunz, M. Fjeld (submitted): Multi-State Device Tracking for Tangible
Tabletops. Submitted to SIGRAD 2011, 4 pages.

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 60/76


Example C: Tangible Tabletop for
Emergency Management

A. Alavi, A. Kunz, M. Sugimoto, M. Fjeld (in press): Dual Mode IR Position and State Transfer
for Tangible Tabletops. Proc. ACM ITS2011, Kobe, Japan, 2 pages.

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 61/76


Example C: Tangible Tabletop for
Emergency Management
Image03

Video: https://www.youtube.com/watch?v=kiKQ4XqtG28.
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 62/76
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 63/76
June 2011: SONY visits t2i Lab

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 64/76


June 2011: SONY visits t2i Lab

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 65/76


June 2011: SONY visits t2i Lab

Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 66/76


Tangible Tabletops: SONY T

http://www.futurelab.sony.net/T/
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 67/76
Tangible Tabletops: Microsoft Dial

Microsoft's Surface Dial

https://www.youtube.com/watch?v=XBBF4X-frOU&feature=youtu.be&t=25
Morten Fjeld, InofMedia, UiB Introduction to Human-Computer Interaction 68/76
ITS '15: Proceedings of the International Conference on Interactive Tabletops and Surfaces
ACM, 15-18th November 2015, Copenhagen, Denmark

Gesture Bike: Examining Projection Surfaces and Turn


Signal Systems for Urban Cycling

Alexandru Dancu1
Simon Eliasson1

Velko Vechev1

Jean-Elie Barjonet1
Ayca Unluer2,1

Joe Marshall3
Simon Nilson1

Oscar Nygren1 Morten Fjeld1

1 Chalmers University 2 Yildiz Technical University 3 University of Nottingham,


of Technology, Sweden Turkey United Kingdom
1 / 39
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

introduction
Bike VS. Car Production

[Bloomberg, 2010]
World Bicycle Production Efficient, sustainable transportation
3
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

related work
Automotive Head-up Displays
Navdy HUD

youtube.com/watch?v=pKL4PJICS40
Jaguar HUD

youtube.com/watch?v=FeK9IkSD_nI
4
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

related work
Bike Signaling Systems

Blaze XFIRE Zackees


Light in front Light lanes to the sides Turn Signal Cycling Gloves
[blaze.cc, 2014] [thexfire.com, 2012] [zackees.com, 2014]

5
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

related work
Phones on Handlebars

Tacticyle: Exploratory Bicycle Trips Rider Spoke


Bicycle navigation system on mobile phone Location-based artistic experience
Minimal navigation cues Using audio cues and map navigation
[Pielot et. al, 2011] [Rowland, et al. 2009]

6
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

own related work


Mobile Projected Displays

CHI’14 ITS’15 PerDis’15 mobileHCI’15

7
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Bicycle gestures...

http://triathlon.competitor.com/2013/06/training/cycling-hand-signals_78701

8
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Gibson’s driving perception theory...

[Gibson and Crooks, 1938]


9
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

contents

10
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

11
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

12
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

13
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=12,
average age 28
(s.d. 7.9)
within subjects

14
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=12,
average age 28
(s.d. 7.9)
within subjects

15
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Experiment 1
Video

https://youtu.be/kzCl3MNZkqw

16
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=12, average age 28 (s.d. 7.9)
within subjects

17
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=12, average age 28 (s.d. 7.9)
within subjects
only performance
was significant (p <0.04)

18
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

HUD 68.3 (s.d. 7.8) and for the projector was 69.5 (s.d. 7.6)

Participants
N=12, average age 28 (s.d. 7.9)
within subjects
Likert scale 1 to 5
19
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=12,
average age 28 (s.d. 7.9)
within subjects

20
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

21
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

22
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Experiment 2
Video

https://youtu.be/kzCl3MNZkqw

23
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=8 average age 28.7 (s.d. 3.0)
within subjects
only performance was significant

24
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Gestures 73.7 (s.d. 6.4) and for the Signal Pod was 67.8 (s.d. 6.1)

Participants
N=8, average age 28.7 (s.d. 3.0)
within subjects

25
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=8,
average age 28.7 (s.d. 3.0)
within subjects

26
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Turn Signalling

27
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Experiment 2B
Video

https://youtu.be/kzCl3MNZkqw

28
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Online Survey, Situation Awareness


Mission Awareness Rating Scale (MARS)

Online Survey

4-point scale
very easy,
fairly easy,
somewhat difficult,
very difficult

Participants
N=40,
avg. age 32.8, (s.d 9.8)
from Europe
and North America.

29
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Online Survey, Situation Awareness


Mission Awareness Rating Scale (MARS)

Online Survey

4-point scale
very easy,
fairly easy,
somewhat difficult,
very difficult

Participants
N=40,
avg. age 32.8, (s.d 9.8)
from Europe
and North America.

30
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participants
N=40,
avg. age 32.8, (s.d 9.8)
from Europe and North America.

31
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

Participant Comments

● “It’s good to have the map on all the time, but I


want to know when to look at it.”

● “This kind of system may be useful in teaching


people to use gestures when biking”

● Multiple participants noted that traffic noise


reduced the effectiveness of button beeps as a
feedback system.

● “You don’t have to do anything special, it comes


naturally and it augments the reality.”

32
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

conclusion
Projection vs. Mobile Phone
Smart Flashlight
[2014]

33
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

conclusion
Projection vs. Head-up Display
Gesture Bike
[2015]

34
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

conclusion

35
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

conclusion

● Projections augmenting vehicle headlights with information


○ communicate with pedestrians
○ showing driver’s intention
○ computing & visualizing the field of safe travel
36
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

application

Copenhagen Bike
bycyklen.dk

● Interface design of bicycles


● Bike sharing e.g. Copenhagen bike
● Display closer to normal line of sight could improve
safety

37
Gesture Bike: Examining Projection Surfaces and Turn Signal Systems for Urban Cycling @ ITS’15

application
Nissan
Oct 27, 2015

[https://youtu.be/h-TLo86K7Ck]

38
Thank you for your attention
39
Acknowledgments t2i.se

This work was supported by the EU FP7 People Programme


(Marie Curie Actions) under REA Grant Agreement 290227

More details on
alexandrudancu.com/gesture-bike
40
Recommender Systems &
Machine Learning

Mehdi Elahi
Associate Professor
University of Bergen (UiB)
?
@mehdielaahi @mehdielaahi
About Me: Education

¤ I have a degree in Electrical Engineering (Sweden 2009)

¤ with Ph.D. in Computer Science (Italy 2014)

¤ Postdoc (Italy 2016)


About Me: DARS Lab in Bergen

University of Bergen (UiB)


Top #163 in the World (QS 2020)

Department of Information Science and


Media Studies (InfoMedia)
Outline

¤ About Me

¤ Part 1

¤ Decision Making

¤ Recommender Systems

¤ Examples of new ideas!

¤ Part 2

¤ Open Discussion
Outline

¤ About Me

¤ Part 1

¤ Decision Making

¤ Recommender Systems

¤ Examples of ideas!

¤ Part 2

¤ Open Discussion
Decisions

¤How many decisions do we make in a day?


Decisions

¤We make 35,000 decisions every day!


¤Decisions on what to eat, watch, listen, read,
play, visit, … ?
Decisions

¤How do we make decisions?


Decision Making

v Example: John wants to buy a dictionary!

Choice A

Choice B

John
Choice C

Jameson, A., Willemsen, M. C., Felfernig, A., de Gemmis, M., Lops, P., Semeraro, G., & Chen, L. (2015).
Human decision making and recommender systems. In Recommender Systems Handbook (pp. 611-648). Springer, Boston, MA.
Decision Making

v John may follow the suggestions of friends!

Jameson, A., Willemsen, M. C., Felfernig, A., de Gemmis, M., Lops, P., Semeraro, G., & Chen, L. (2015).
Human decision making and recommender systems. In Recommender Systems Handbook (pp. 611-648). Springer, Boston, MA.
Decision Making

v Maybe buy the cheapest?

Choice A

Choice B

Choice C

Jameson, A., Willemsen, M. C., Felfernig, A., de Gemmis, M., Lops, P., Semeraro, G., & Chen, L. (2015).
Human decision making and recommender systems. In Recommender Systems Handbook (pp. 611-648). Springer, Boston, MA.
Recommender Systems
¤ Tools that support the decision-making process
¤ They can learn user preferences and suggest
products that may be interesting to them.
Recommender Systems

¤ Examples:
Jam Experiment
User Benefits
¤ Recommender Systems can tackle choice overload
¤ Example: Amazon with 500 Million options!!
Variety of Choices
(figure: videos, purchases, tweets, …)

Velocity of Choices

Only 1 minute?!
What about 1 hour, or 1 day?!
Benefits: Business
Benefits: Business

boost!
Benefits: Business

¤ More than 30% of Amazon revenue made by


recommendation.

¤ More than 80% of Netflix revenue made by


recommendation.
1 Million $ Prize!
Future of Shops
Recommender Systems

Jameson, A., Willemsen, M. C., Felfernig, A., de Gemmis, M., Lops, P., Semeraro, G., & Chen, L. (2015).
Human decision making and recommender systems. In Recommender Systems Handbook (pp. 611-648). Springer, Boston, MA.
Recommender Systems: Data

Data Model
AI Algorithms

learns
User Preferences

AI

CSICC 2021
AI Algorithms

AI

generates

Recommendations

CSICC 2021
AI Algorithms

learns
User Preferences

AI

generates

Recommendations

CSICC 2021
AI Algorithms

learns
User Preferences

AI

generates

Recommendations

CSICC 2021
Data

¤User data (preferences)


¤Explicit feedback (ratings, likes/dislikes)
¤Implicit feedback (clicks, emotions)

¤Item data (content)


¤High level
¤Low level
User Data

Items (columns) × Users (rows):

5 ? 5 ? 2
? 2 ? 1 ?
? 5 ? ? 3
3 ? 4 ? 1
? ? ? ? ?

(a number is a known rating, “?” an unknown rating)

¤ Explicit feedback: item evaluations that the user


explicitly reports, e.g., five star ratings in Netflix, or
like/dislikes in Facebook

¤ More informative of user preferences; costly to collect.


User Data

Items

1 ? 1 ? 1
? 1 ? 1 ?
Users ? 1 ? ? 1 Known action
1 ? 1 ? 1
? ? ? ? ? Unknown action

¤ Implicit feedback: actions performed by the users on


the items, e.g., video views in YouTube or purchase
history of a user in Amazon.

¤ Less informative of user preferences; cheaper to collect.
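A minimal sketch (my own illustration, not from the slides) of turning an implicit-feedback log of user actions into the kind of 0/1 user-item matrix pictured above:

import numpy as np

users = ["u1", "u2", "u3"]
items = ["i1", "i2", "i3", "i4"]
log = [("u1", "i1"), ("u1", "i3"), ("u2", "i2"), ("u3", "i4")]   # e.g. views or purchases

R = np.zeros((len(users), len(items)))
for u, i in log:
    R[users.index(u), items.index(i)] = 1.0    # 1 = known action, 0 = no observed action
print(R)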


User Personality Data

¤ Personality: stable characteristics of users which


accounts for individual differences among them.

Marko Tkalčič, Giovanni Semeraro, Marco de Gemmis, Personality and Emotions in Decision Making and Recommender Systems, DMRS2014
User Personality Data
¤ Personality correlates with people’s preferences
User Personality Data

Marko Tkalčič, Giovanni Semeraro, Marco de Gemmis, Personality and Emotions in Decision Making and Recommender Systems, DMRS2014
User Personality Data
Demo: IBM Watson

¤ Predict personality, needs & values through written text

¤ Understand individual customers’ habits & preferences


Personality Data

Marko Tkalčič, Giovanni Semeraro, Marco de Gemmis, Personality and Emotions in Decision Making and Recommender Systems, DMRS2014
Emotions

Marko Tkalčič, Giovanni Semeraro, Marco de Gemmis, Personality and Emotions in Decision Making and Recommender Systems, DMRS2014
Emotions
Emotions
Emotions

Prediction

46
Emotions

How people think they look. How they actually look.
Heart Rate

?
48
Emotions
Friend Recommendation

Lifelogging

50
Friend Recommendation

Recommendation
51
Friend Recommendation

Lifelogging

AI

Recommendation
52
Item Data

¤Item data (content)

¤High level
¤ Item description, category, tags, …

¤Low level (visual)


¤ Light and colors in videos, ...
High Level
Low Level
Evan Almighty 2007

Light is used as a symbol


representing moral goodness, in
contrast to darkness as evil.
Harry Potter and the Half-Blood Prince 2009

Van Sijll, J. (2005). Cinematic storytelling: The 100 most powerful film conventions every filmmaker must know. Studio City, CA:
Michael Wiese Productions.
Colors

Nightfall (2016)

Colors are used to convey


emotional meanings, e.g.,
expressing loneliness (blue) or
tension (red).
The Lion King (1994)

Van Sijll, J. (2005). Cinematic storytelling: The 100 most powerful film conventions every filmmaker must know. Studio City, CA:
Michael Wiese Productions.
Methodology

1. Video Segmentation
2. Key-Frame Detection
3. Feature Extraction
4. Feature Aggregation
5. Recommendation

(diagram: a video is segmented, key-frames are detected among its frames, visual features are extracted per key-frame as <key-frame1, visual features> … <key-frameN, visual features>, and these are aggregated into a single <movie, aggregated visual features> descriptor used for recommendation)
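A minimal sketch of steps 4-5 under simple assumptions: per-key-frame feature vectors are mean-pooled into one movie descriptor, and movies are ranked by cosine similarity to a user profile. The actual feature extraction (step 3) would come from a vision model and is not shown:

import numpy as np

def aggregate(keyframe_features):
    """Step 4: average per-key-frame visual features into one movie descriptor."""
    return np.mean(np.stack(keyframe_features), axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def recommend(user_profile, movie_descriptors, k=5):
    """Step 5: rank movies by similarity of their descriptor to the user's visual profile."""
    ranked = sorted(movie_descriptors.items(),
                    key=lambda kv: cosine(user_profile, kv[1]), reverse=True)
    return [title for title, _ in ranked[:k]]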
Recommender Systems: Model

Data Model
Recommender Models

Recommender
System
Recommender Models

A. Personalization:
how personalization is performed when
recommending the items to users.

Personalized: recommending different items to different users
Non-personalized: recommending the same items to all users
Recommender Models

A. Example

Non-personalized Personalized
Recommender Models

B. Hybridization:
whether the recommender algorithm is based
on a single or multiple heuristics

Single-heuristic: are those models that implement a single unique method.
Combined-heuristic: hybridize single-heuristic models by combining several methods.
Recommender Models
v Examples:

Collaborative Filtering
(CF)
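A minimal user-based collaborative filtering sketch (my own illustration, using the toy rating matrix from the earlier User Data slide; production systems typically use more robust similarity measures or matrix factorization):

import numpy as np

R = np.array([[5, np.nan, 5, np.nan, 2],
              [np.nan, 2, np.nan, 1, np.nan],
              [np.nan, 5, np.nan, np.nan, 3],
              [3, np.nan, 4, np.nan, 1]])     # rows: users, columns: items, nan: unknown

def sim(u, v):
    mask = ~np.isnan(u) & ~np.isnan(v)        # compare only co-rated items
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def predict(user, item):
    sims, ratings = [], []
    for other in range(R.shape[0]):
        if other != user and not np.isnan(R[other, item]):
            sims.append(sim(R[user], R[other]))
            ratings.append(R[other, item])
    if not sims:
        return np.nan
    return float(np.dot(sims, ratings) / (np.sum(np.abs(sims)) + 1e-9))

print(predict(user=0, item=1))   # estimate one of the "?" cells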
Recommender Models
v Examples:

Content-based
(CB)
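And a minimal content-based sketch (my own illustration): items are described by high-level content features such as genres or tags, the user profile is the mean of the features of liked items, and unseen items are ranked by similarity to that profile:

import numpy as np

items = {"A": [1, 0, 1, 0],     # toy feature vectors, e.g. one column per genre/tag
         "B": [1, 1, 0, 0],
         "C": [0, 1, 0, 1],
         "D": [0, 0, 1, 1]}
liked = ["A", "B"]              # items this user rated highly

profile = np.mean([np.array(items[i], dtype=float) for i in liked], axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

ranking = sorted((i for i in items if i not in liked),
                 key=lambda i: cosine(profile, np.array(items[i], dtype=float)),
                 reverse=True)
print(ranking)   # unseen items, most content-similar first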
Recommender Systems: Interface

Data Interface

Model
User Interaction

¤ Active Learning in Netflix


How to Choose?

¤ Not all ratings of items are equally informative, i.e., they do not all bring equal information to the system.
Active Learning

¤Active learner uses a set of rules to choose the


best items for the users to rate.
Active Learning

The system computes the scores according to a strategy:

Item   Score
1      151
2      44
3      7
4      1
5      42
6      34
7      9
8      55
9      20
…      …
N      12
Active Learning

The user receives the top-10 items computed and proposed by the system:

Item   Score
1      151
8      55
43     54
11     50
2      44
5      42
6      34
22     33
75     29
13     25
Active Learning

The items that are rated (here: items 1, 2, 5, 75, 13) are added to the train set.
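A minimal sketch of this elicitation loop (my own illustration; the scoring strategy, e.g. popularity or expected information gain, is passed in as a function, and the user may decline to rate an item):

def active_learning_round(candidates, score, ask_user, train_set, top_n=10):
    """One round: score candidate items, propose the top-N, add any obtained ratings."""
    proposed = sorted(candidates, key=score, reverse=True)[:top_n]  # system computes scores
    for item in proposed:                                           # user sees the proposals
        rating = ask_user(item)                                     # None if the user skips it
        if rating is not None:
            train_set[item] = rating                                # rated items grow the train set
    return train_set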
Interface
Interface

Why?
All these items are predicted
to be “liked” by the user
Interface
Interface

Watching alone or with friends / family / …?


Interface

Watching on laptop or phone?


Interface

Alexander Felfernig, Biases in Decision Making, DMRS 2014


Ideas

You may have new ideas!

Lets see some examples.


Idea: Tourism

Can I develop a restaurant recommender based on feelings?

80
South Tyrol Suggests (STS)
South Tyrol Suggests (STS)

¤ An Android app that recommends


restaurants and museums in northern Italy.

¤ 27,000 attractions!
South Tyrol Suggests (STS)

Personality Traits:
Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism
South Tyrol Suggests (STS)

¤ Using the personality of the user in the


prediction model, the system estimates
which POIs the user likely has experienced,
and hence, can rate.
South Tyrol Suggests (STS)
South Tyrol Suggests (STS)
Idea: Smart Food Recommender

Can I design a recommender that helps people to eat better?

87
ChefPad
ChefPad
ChefPad
ChefPad
ChefPad
ChefPad

Recommendation and
critiquing
Idea: Smart University Finder!

Can I build a smart tool to find which university to apply to?

94
Idea: Smart University Finder!

UniMatch: Building a Personalized University Ranking List


UniMatch

UniMatch: Building a Personalized University Ranking List


UniMatch

UniMatch: Building a Personalized University Ranking List


Interface

UniMatch: Building a Personalized University Ranking List


UniMatch

UniMatch: Building a Personalized University Ranking List


Discussion

Please share your idea!


Thank you!
Questions?

@mehdielaahi @mehdielaahi

Interested in working as a research assistant?


Contact me! [email protected]
Virtual Reality &
Augmented Reality
Ilona Heldal
26 November 2021
My background:
• Private
• Hungarian born in Romania, Swedish
citizen, moved to Norway 2016
• 2 kids grown up with virtual reality

• My professional background
• Professor of Informatics, Interactive
Systems at HVL
• HCI, User Experience (UX), Innovation
• PhD 2004 Usability in Collaborative
Virtual Environments
• Mixed Reality Environments
Agenda
• New interfaces: Interaction with new technologies (Chap 7)
• Virtual Reality, and Augmented Reality
• Origins
• Examples
• Important properties
• Interaction
• Immersiveness
• Presence
• Realism
• Why should we study VR (AR, MR, Serious Games)?
Interfaces Basics (for developing and evaluating)
• Command
• Graphical • Similarities
• Multimedia Conceptual models
• Virtual reality (VR) Graphics
• Web Interface metaphors
• Mobile Interaction styles
• Appliance DM direct manipulation
• Voice WIMP (Windows, Icons, Menus, Pointers)
• Pen
• Touch • Differences
• Gesture Common technologies while some others special complex
• Haptic
Expensive vs. affordable
• Multimodal
For ever one for special groups unique
• Shareable
Specialized tools vs everyday applications
• Tangible
• Augmented reality (AR)
• Wearables
• Robots and drones
• Brain computer
Some earlier connections

1979: One of the first speech and gesture interfaces


https://youtu.be/RyBEUyEtxQo

1994: Interactive textbook (Don Norman's)


https://vimeo.com/18687931
Answer the questions (see, listen, observe):
Watch the video and answer the questions:
https://www.youtube.com/watch?v=f9MwaH6oGEY&list=RDCMUC4QZ
_LsYcvcq7qOsOhpAX4A&index=1
1. How old is VR/AR?
2. What is VR?
3. What is AR?
4. What is the difference?
5. Why should we care about so many different interfaces?
Answers from research(ers)
What is VR?
Sutherland's vision: "The ultimate display"
"The ultimate display would of course be a room within which the computer can control the
existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs
displayed in such a room would be confining and a bullet displayed in such a room would be fatal."
Source: ArtMuseum.net. Multimedia: From Wagner to Virtual Reality

Display as a window into a virtual world


• Improve image generation until the picture looks real
• Computer maintains world model in real time
• User directly manipulates virtual objects
• Manipulated objects move realistically
• Immersion in virtual world via head-mounted display
• Virtual world also sounds real feels real
Brooks, F.P., 1999. What's real about virtual reality?. IEEE Computer
graphics and applications, 19(6), pp.16-27.

Sensorama: One of the first portable AR?


What is AR?
Mixed reality (relation to AR?)
MR systems are designed to give their users
the illusion that digital objects are in the same
space as physical ones.
For this illusion of coexistence, the digital
objects need to be precisely positioned into
the real environment and aligned with the real
objects in real time.
In fact, the precise real-time alignment or
registration of virtual and real elements is a
definitive characteristic of augmented reality
systems.
Costanza, E., Kunz, A. and Fjeld, M., 2009. Mixed reality: A survey. In Human machine interaction (pp. 47-68). Springer, Berlin,
Heidelberg.
Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B.: Recent advances in augmented reality. IEEE Comput. Graph. Appl.
21(6), 34–47 (2001)
Mixed reality continuum
(illustrating the difference between VR and AR)

Milgram, P. and Kishino, F., 1994. A taxonomy of mixed reality visual displays.
IEICE TRANSACTIONS on Information and Systems, 77(12), pp.1321-1329.
Why is it important to know the differences?
Apply the learnings:
What is VR?
What is AR?
What is possible, and what not?

Please think about approaches how engineers need to:
• Design (and develop)
• Plan using the interfaces
• Interact with
• Usage contexts
• User properties

Watch the movie and answer the questions above


https://www.youtube.com/watch?v=psWfx1i6BlY
Important properties associated to virtual and augmented reality

• Interaction
• Immersiveness
• Presence
• Realism

(figure: Human – Interface – Technology)
Interaction
• Navigation

Select the destination:
Pointing
World In Miniature (WIM)
List of defined paths

Move to destination:
Teleportation
Interpolation
Guided visit metaphor

Wijkmark, C.H., Metallinou, M.M. and Heldal, I., 2021. Remote Virtual Simulation for Incident Commanders—Cognitive Aspects. Applied Sciences, 11(14), p.6434.
• Manipulation
• Orientation
Bowman D. A., Kruijff E., LaViola J. J. et Poupyrev I. (2004).
3D User Interfaces : Theory and Practice. Addison Wesley
Manipulation
(ex)

Weise, Matthias, Zender, Raphael and Lucke, Ulrike. "How Can I Grab That?: Solving Issues of Interaction in VR by
Choosing Suitable Selection and Manipulation Techniques" i-com, vol. 19, no. 2, 2020, pp. 67-85.
Orientation
Ex:
for Magic Leap

Magic Leap Patent | Augmented Reality Viewer


With Automated Surface Selection Placement
And Content Orientation Placement
Immersiveness: a term reserved to technology
Except: Game development

Non-immersive: 2D technologies (with 3D representations in 2D)
Technologies allowing the combination of 2D and 3D
Immersive: HMDs, CAVEs

Immersiveness in computer science is about the technology
Presence: Our experiences via our senses

Touch and pressure able to grasp or perceive

Awareness of body balance

Simulating the
sensation of movement

17 (26)
Realism (Discuss when it is important and when not)
(an example from VR for firefighters)

• VR for skills training:
https://www.youtube.com/watch?v=FpDeILcupBM

• VR for command training (the film is only about the training places):
https://www.youtube.com/watch?v=EURrFZqg9Sc&t=2s
Realism: What is it?

Morpheus explains
https://www.youtube.com/watch?v=t-Nz6us7DUA&t=1s

Think about «presence as electrical signals interpreted by your brain»


Discuss: How we think about «real».
What is real?

Sensory experiences Perception


Where does it happen?
Why is it important to know the differences?

Use?
Users?
Mobile VR/AR on the market

https://onix-systems.com/blog/top-10-applications-of-ar-and-vr-in-business
How to make VR?
1. https://learn.unity.com/course/create-with-vr?tab=overview
Some take away lessons
• At its best, virtual reality and augmented reality are
• among the most immersive forms of technology
• a huge number of possibilities
• high-quality user experiences
• Content
• Interaction
• Remembering
• At its worst, VR
• makes its users feel motion sick
• Is expensive
Questions?
[email protected]
References
Literature
Sharp, H., Preece, J., & Rogers, Y. (2019). Interaction Design: Beyond Human-Computer Interaction. Wiley. (Chap 7 for this lecture)

Other Reading
About VR Source: ArtMuseum.net. Multimedia: From Wagner to Virtual Reality.

Brooks, F.P., 1999. What's real about virtual reality?. IEEE Computer graphics and applications, 19(6), pp.16-27.

Costanza, E., Kunz, A. and Fjeld, M., 2009. Mixed reality: A survey. In Human machine interaction (pp. 47-68). Springer, Berlin, Heidelberg.

Milgram, P. and Kishino, F., 1994. A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), pp.1321-1329.

Heldal, I., 2004. The usability of collaborative virtual environments: Towards an evaluation framework (pp. 0187-0187). Chalmers University of Technology

Wijkmark, C.H., Metallinou, M.M. and Heldal, I., 2021. Remote Virtual Simulation for Incident Commanders Cognitive Aspects. Applied Sciences, 11(14), p.6434.

Top 10 AR/VR Applications in Mobile Apps – Onix: https://onix-systems.com/blog/top-10-applications-of-ar-and-vr-in-business

Gartner hype tech for 2021: https://endeavorservice.com/business-and-technology/3-themes-surface-in-the-2021-hype-cycle-for-emerging-technologies/

Virtual cocoon (5 sense VR) https://www.markstechnologynews.com/2009/03/virtual-cocoon-headset-see-smell-and.html

Video Sources
Don Norman's interactive book https://vimeo.com/18687931

Put that there (1979): MIT demonstrates the first speech and gesture interface: https://youtu.be/RyBEUyEtxQo

Introduction to VR & AR https://www.youtube.com/watch?v=f9MwaH6oGEY&list=RDCMUC4QZ_LsYcvcq7qOsOhpAX4A&index=1

Mr Bean https://www.youtube.com/watch?v=psWfx1i6BlY

Flaim Trainer https://www.youtube.com/watch?v=FpDeILcupBM

XVR training scenarios: https://www.youtube.com/watch?v=EURrFZqg9Sc&t=2s

Matrix "What is reality?" https://www.youtube.com/watch?v=t-Nz6us7DUA&t=1s

Learn:
https://learn.unity.com/course/create-with-vr?tab=overview

https://www.youtube.com/watch?v=1VC3ZOxn2Lo&t=642s
LIFELOGGING
Our journey from academic research into
innovation
Duc Tien Dang Nguyen
INFO162 - 2nd November 2021

Duc Tien Dang Nguyen (Đặng Nguyễn Đức Tiến)

Just call me
“Tien”
Associate professor at the Department
of Information Science and Media
Studies, University of Bergen.

Research fields: multimedia forensics,


lifelogging, multimedia retrieval,
computational photography, computer
vision and applied machine learning.

Computer Scientist, Puzzler, and Video


Gamer. Passionate about technology, Office: 02.121 - MCB
recreational mathematics, photography,
video games, and hiking. Email: [email protected]
Contents

• What is lifelogging - From theory to our vision


• The search engines - What we actually did
• The research challenges - Working with the academic
• Get out of the office - Working with the real world

Lifelogging
We have grown comfortable
searching online content….

Now we have our own content… deep life experience data

6 (c) 2018
Depending on what you want to measure,
there is likely to be a device (or app).

7 (c) 2018

Three Motivating Factors


Driven by low-cost, reliable, ubiquitous sensors, massive volumes of personal
sensor data are being created that can fit on a low cost ($100) hard disk.

New forms of multimedia and multimodal data analytics tools means that we can
now begin to extract value from these archives… This is what I am interested in…
applying Information Retrieval tools to ‘deep life experience’ data.

This will become a common activity… as common as wearing


glasses…

8 (c) 2018
9

Dodge and Kitchin (2007), refer to lifelogging as

“a form of pervasive computing, consisting of a


unified digital record of the totality of an
individual’s experiences, captured multi-modally
through digital sensors and stored permanently
as a personal multimedia archive”.

(c) 2018

Life Experience

Lifelog

The individual can have a searchable


lifelog for many aspects of life experience…
activities, experiences, behaviours,
information, biometrics… huge volumes of
sensor data captured passively. It is up to us
to develop the search engines.

10 (c) 2018
Biohacking: Using knowledge for self-improvement through experimentation.

Lifelogging: Digitise as much as you can of life experience… for many reasons (memory, health, etc.).

Quantified Self: Sense and analyse factors of interest through numbers to gain knowledge.
11 (c) 2018

12

The Idea
Through wearable and informational sensors, an
individual will be able to gather a detailed
multimodal personal lifelog that they can use for
many beneficial applications.

Augmented Wellness, Augmented Cognition, Augmented Citizen
(c) 2018
Augmented Wellness 13

Personal Insights

Data-driven Health
Interventions

Functionality-replacement

(c) 2018

Augmented Cognition Augmented Memory

Enhanced Productivity

Enhanced Learning

(c) 2018
Augmented Society Citizen Data

Safety

(c) 2018

In reality, we can’t yet imagine many of the use-cases of lifelogs, but they could
become a permanent companion assisting the individual throughout life. Constantly
growing in size and value.

1. Knowledge Support (Child): Focus on supporting knowledge acquisition and learning in the early years.

2. Productivity (Adult): From education to the workplace, providing information and insights to assist productivity and fitness.

3. Health (Elderly): Into old age, providing support for cognition and health to maintain independence and activity.

16 (c) 2018
Lifelogging is an application of Information Retrieval, but it also integrates many more disciplines in order to develop holistic solutions. It is not cleanly defined as a document retrieval task; there is not even a standard for what a lifelog document is. Yet it holds promise to enhance the lives of all who engage. Hence, it is an application of IR for good purposes, and we are going to explore it here.

Data Processing: A variety of data, different timings, different accuracies, needing different tools.

User Experience: Develop fixed and ubiquitous capture & access methods for all stakeholders.

Personal Data: The ethics of how to use rich personal data & doing so in a privacy-aware manner.

Search & Retrieval: Scalable & efficient indexing with contextual querying and no defined unit of retrieval. Use-cases need pervasive access and contextual querying. Anywhere, anytime.

[Word collage of neighbouring disciplines, including multimedia analytics, pervasive computing, information retrieval, privacy & ethics, ethnography, and human memory.]

17 (c) 2018

18 (c) 2018
The search engines

[Diagram: sources of personal data arranged by data control level, from Public (Flickr, Google, Bing, IMDB, public apps), through Social (Google+, Facebook, Twitter, LinkedIn, YouTube, Quora, SlideShare, StackOverflow), Personal (messengers, Slack, SkyScanner, SMS, emails, phone photos, phone calls, Skype messages, FB posts, Google photos, IMDB ratings, Amazon reviews, CV, blogs, website, local search), down to Intimate and Self (QS data, Moves data). Data is generated by and shared with partners, family members, relatives, friends, colleagues, acquaintances, strangers and passers-by.]

Manual Event Segmentation in the Original Microsoft SenseCam Browser

S. Hodges, L. Williams, E. Berry, S. Izadi, J. Srinivasan, A. Butler, G. Smyth, N Kapur, K. Wood, P. Dourish, A.
Friday. SenseCam: A Retrospective Memory Aid. UbiComp 2006

21 (c) 2018

A first generation lifelog browser that


represents days as events with a single
keyframe shown for every event.

AR. Doherty, AF. Smeaton, K. Lee, and DPW Ellis. Multimodal segmentation of
lifelog data. In Large Scale Semantic Access to Content (Text, Image, Video, and
Sound) (RIAO ’07), Paris, France, 21-38.

This event segmentation is an


inherently subjective activity and a
human’s understanding of life events
changes over time. We refer to this as
event decay.

AR. Doherty, C. Gurrin, and AF. Smeaton. An investigation into event decay from
large personal media archives. In Proceedings of the 1st ACM international
workshop on Events in multimedia (EiMM '09). ACM, New York, NY, USA

22 (c) 2018
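To make the idea concrete, here is a minimal sketch of sensor-based event segmentation (an illustration only, not the Doherty et al. algorithm; the per-minute record format and field names are assumptions): consecutive minutes are grouped into one event until the dominant activity or location changes, and the middle image of each event serves as its keyframe.

def segment_into_events(minutes):
    # Group consecutive minute records into events; a new event starts
    # whenever the activity or the location label changes.
    events, current = [], []
    for record in minutes:
        if current and (record["activity"] != current[-1]["activity"]
                        or record["location"] != current[-1]["location"]):
            events.append(current)
            current = []
        current.append(record)
    if current:
        events.append(current)
    return events

def keyframe(event):
    # Pick one representative image per event, here simply the middle one.
    images = [img for rec in event for img in rec.get("images", [])]
    return images[len(images) // 2] if images else None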
Events with colour coded minutes/novelty/activities

EyeAware Platform
23

(c) 2018

Life Abstraction
(objects, people, products)
P Kelly, AR Doherty, AF Smeaton, C Gurrin, N O'Connor. The colour of life: novel visualisations of population
lifestyles. ACM Multimedia 2010 pp:1063-1066
24 (c) 2018
Fully Automated

Applying deep learning (AI) to understand what a user sees in real-time…


involuntary retrieval?

25 (c) 2018

A First Generation Search Engine (2012)

A Doherty, K Pauly-Takacs, N Caprani, C Gurrin, C Moulin, N O'Connor and A.F. Smeaton. Experiences of aiding autobiographical
memory Using the SenseCam. Human–Computer Interaction, 27 (1-2). pp. 151-174. ISSN 0737-0024
26 (c) 2018
Epidemiological Studies
The Kidscam food/diet analytics system
supports population-wide analytics and has
been replicated in Beijing, China and in
Melbourne, Australia.
Food marketing - On average, children were
exposed to non-core food marketing 27.3 times
a day (95% CI 24.8, 30.1) across all settings
Alcohol - Children are exposed to alcohol
marketing on 85% of their visits to
supermarkets.
Signal, L. et al. Kids’Cam: An Objective Methodology to Study the World in Which Children Live. American Journal of Preventive Medicine, Volume 53, Issue 3, e89-e95.

Zhou Q., Wang D., Mhurchu C.N., Gurrin C., Zhou J., Cheng Y. & Wang H. The use of wearable cameras in assessing children's dietary intake and behaviours in China. Appetite (2019). doi: https://doi.org/10.1016/j.appet.2019.03.032

(c) 2018

Manual

In a Store In the Home

In Daily Life In the School


28 (c) 2018
User tags: #foodie, #hungry, #yummy, #burger
Machine tags: #burger, #chicken, #fries, #chips, #ketchup, #milkshake

Figure 1: A comparison of user-provided tags vs. machine-generated tags. In this example, the user uses only #burger to describe what they are eating, potentially not perceiving the fries as worth mentioning, though they are providing subjective judgement in the form of #yummy. However, machine-generated tags provide more detailed factual information about the food plate and scene, including #fries, #ketchup, and #milkshake.

User tags: #pork, #foodie, #japanese, #chicken, #katsu, #box, #salad, #restaurant, #bento, #rice, #teriyaki
Machine tags: #teriyakichicken, #stickyrice, #chickenkatsu, #whiterice, #teriyaki, #peanutsauce, #ricebowl, #spicychicken, #fishsauce, #bbqpork, #shrimptempura, #ahituna, #friedshrimp, #papayasalad, #roastpork, #seaweedsalad, #chickenandrice

Figure 2: Another comparison of user-provided tags vs. machine-generated tags. Machine-generated tags provide more detail about the food plate, such as the type of rice, i.e., #stickyrice and #whiterice, and the dressing on the food item, i.e., #peanutsauce.

Photos credit: F. Ofli et al., Is Saki #delicious? The Food Perception Gap on Instagram and Its Relation to Health, WWW, 2017

Baseline search engine - Eating moments detection

[Pipeline: raw data is collected, feature vectors are extracted and stored in an indexed database, and users interact with the index through an API/interface.]

Baseline search engine versus…

• 14,760 lifelog images of


eight different foods:
Vietnamese Roll Cake,
Sizzling Cake, Broken Rice,
Fried Chicken, Beef Noodle,
Bread, Salad, and Pizza.

• Each type has a number of images varying from 2,000 to 3,000.

• Manually annotated and


labeled.
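As a rough illustration of the feature-vector baseline sketched above (not the actual system; the descriptor source and variable names are assumptions), retrieval over such an image collection can be as simple as nearest-neighbour search over precomputed descriptors:

import numpy as np

def search(features, paths, query_vec, k=5):
    # features: N x D matrix of precomputed image descriptors (e.g. CNN embeddings),
    # paths: list of N image file names, query_vec: descriptor of the query image.
    dists = np.linalg.norm(features - query_vec, axis=1)  # distance from the query to every image
    best = np.argsort(dists)[:k]                          # indices of the k closest images
    return [(paths[i], float(dists[i])) for i in best]

A real lifelog engine would add an index over visual concepts and metadata, but the principle is the same.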
The research challenges

Applications of Lifelog: epidemiological studies using our analytics. NTCIR-Lifelog, ImageCLEF and LSC: participants.
The steps

• Build a common dataset


• Set up a common challenge
• Find the teams

• Keep doing it!

let's face a real practical challenge ...

A lack of common datasets!


Common datasets are hard to find. We have been working on several:

Kvasir - a multi-class image dataset for computer-aided gastrointestinal disease detection.
A crowdsourced dataset of edited images online.
RAISE - a raw images dataset for digital image forensics.
Heimdallr - a dataset for sport analysis.

… however, a lifelog is very different.

Challenges

What to log?

How often to log?


A willingness to share?
Finding people who are willing to donate years or even decades of data is a challenge. What should NOT be shared?

Who can access the data?


Principles

The Continuity: Lifelog data of each individual should be captured continuously for at least 15

The Anonymity: All user-identifiable data have to be removed.

The Completeness: The lifelogs should contain four basic types of information: visual data, personal biometrics, human activity, and information accesses.

The Protectiveness: The dataset should be password protected and all accesses should be logged. The lifeloggers, researchers or any identifiable individuals who appear in the data can request some content to be deleted at any time, and need to agree to use of data beyond any initial planned dataset release.

Personal data is no longer just our Facebook postings or our cameraphone images…

It spans: vision, thought/attention, information needs, reading content, writing content, media consumption, media creation, speech, hearing, device interactions, biometrics, productivity, social network postings, environmental data, profiles in others' lifelogs, locations, and activities.

A suite of non-invasive technology to capture all our experiences.
LoggerMan: privacy-aware HCI and information logging (reading content, writing content, device interactions).

Z. Hinbarji, R. Albatal, N. O’Connor and C. Gurrin (2016). LoggerMan, a comprehensive logging and visualisation tool to capture computer usage. In: 22nd International Conference on MultiMedia Modelling (MMM 2016), 4-6 Jan 2016, Miami, FL.
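One way to keep such interaction logging privacy-aware (a sketch of the general idea only, not LoggerMan's actual implementation; class and field names are invented for illustration) is to store per-minute aggregates rather than the raw content:

import time
from collections import Counter

class PrivacyAwareKeystrokeLog:
    # Keeps per-minute keystroke counts per application, never the typed text itself.
    def __init__(self):
        self.counts = Counter()

    def on_keystroke(self, app_name):
        minute = time.strftime("%Y-%m-%dT%H:%M")
        self.counts[(minute, app_name)] += 1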

Captured at different frequencies, with different error rates, and in a huge number of different modalities… For example, within a single minute: 72 heart beats, 12 GPS locations, 12 physical activity logs, 2 images, 450 keystrokes, 0.07 glucose readings, and so on.
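One way to picture this alignment is a single record per minute that buckets whatever each sensor produced during that minute (a sketch; the field names and exact rates are assumptions based on the figures above):

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MinuteRecord:
    # All sensor readings that fall within one minute of the lifelog.
    timestamp: str                                                  # e.g. "2016-08-15T00:03"
    heart_rate: List[int] = field(default_factory=list)            # roughly 72 samples per minute
    gps: List[Tuple[float, float]] = field(default_factory=list)   # roughly 12 fixes per minute
    activity: List[str] = field(default_factory=list)              # roughly 12 labels per minute
    images: List[str] = field(default_factory=list)                # typically 1-2 image paths per minute
    keystrokes: int = 0                                             # counted, not stored as text
    glucose: Optional[float] = None                                 # sparse (about 0.07 readings per minute)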
Process

Data Gathering
Individuals' lifelogs are gathered by
following the first two principles.

Dataset Publishing
We protect the data by following the protectiveness principle, as well as
putting in place a data download
tracking mechanism.

Dataset Organising
Gathered data is cleaned by
removing user-identifiable data
(applying the anonymity principle)
and then organized hierarchically
with the basic units composed from
every minute.

Anonymisation - Data (Raw) Level

The lifelogger first reviews and cleans their own lifelog. A trusted researcher then (1) reviews and cleans it again, (2) resizes images so that embedded text becomes hard to read, and (3) anonymises identifiable material (for example, the latitude/longitude of the home location), before the shared lifelog is released as a dataset.
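A minimal sketch of the per-image part of this pipeline, assuming the Pillow library is available (the target size matches the redacted LSC images mentioned later; proper face blurring would additionally need a face detector and is omitted here):

from PIL import Image

def redact_image(src_path, dst_path, size=(1024, 768)):
    # Resize the frame so embedded text becomes hard to read, then save a redacted copy.
    img = Image.open(src_path)
    img = img.resize(size)
    img.save(dst_path)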
Anonymisation - Application Level

[Architecture diagram with the following components: user software, storage, feature extraction, semantic enrichment, analytics engine, insight & query engine, and professional analytics applications.]
Four components of the published dataset

Lifeloggers (data - individuals): healthy, interested, fully aware, and in control of the data; total control of data gathering; self-cleaning and deletion of data.

Researcher (clean - data): one trusted researcher cleans the data again; resizes the data to make text difficult to read; anonymises all data to blur faces and remove identifiable material.

Storage (protect - system): password-protects the data in a zip file and logs all accesses; unique username and password for all participants; use of individual and organisational data agreements.

Dataset (manage - individuals): individuals can request data deletion at any time and need to agree to use of data beyond the initial plan; researchers can also request data deletion at any time.

Minute as the unit of retrieval - four data streams:

01 Wearable Multimedia: 1,500 images per day from the Narrative wearable camera, with accompanying concept annotations; manual photos captured; music listening history.

02 Human Biometrics: 24x7 heart rate, galvanic skin response (mostly), calorie burn, steps; daily blood glucose level & blood pressure; weekly cholesterol and uric acid measurements.

03 Information Access: on-screen reading, keystrokes on keyboard, mouse movements, computer activity, web pages viewed.

04 Human Activity: physical activities (walking, running, transport, etc.), locations visited, food eaten, mood.
The LSC 2020 Dataset

● The provided dataset was used in the 2020 Lifelog Search Challenge,
which contains:
○ Image Dataset (38.49GB) of wearable camera images, fully redacted
in 1024 x 768 resolution, captured using OMG Autographer and
Narrative Clip devices.
○ Metadata for the collection (2.8MB), representing time, physical
activities, biometrics, locations, etc…
○ Visual Concepts (79.9MB) extracted from the non-redacted version of
the visual dataset.

48 Topics were used, such as this…

TITLE: Tower Bridge

DESCRIPTION: Find the moment(s) when I was looking at


Tower Bridge in London

NARRATIVE: To be considered relevant, the full span of


Tower Bridge must be visible. Moments of crossing the
Tower Bridge or showing some subset of Tower Bridge are
not considered relevant
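A topic like this can be represented, and a first candidate set pulled from the minute-level metadata, along the following lines (a sketch only; the field names are assumptions, not the official LSC tooling):

from dataclasses import dataclass

@dataclass
class Topic:
    title: str
    description: str
    narrative: str   # the binding definition of what counts as relevant

tower_bridge = Topic(
    title="Tower Bridge",
    description="Find the moment(s) when I was looking at Tower Bridge in London",
    narrative="The full span of Tower Bridge must be visible; crossing it or "
              "seeing only part of it is not relevant",
)

def candidate_minutes(records, concept="Tower Bridge"):
    # First filtering step: keep minutes whose visual concepts or location metadata
    # mention the landmark; a human searcher then judges them against the narrative.
    return [r for r in records
            if concept in r.get("concepts", []) or concept in r.get("location", "")]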

Solution: Visual, Location, Activity


Example Retrieval Query
0, 30, 60, 90, 120, 150… 300 seconds
52

I see Steve Wozniak…

I see Steve Wozniak on a wall of portraits.

I see Steve Wozniak on a wall of portraits. The wall was a brick wall with a door and large heater.

I see Steve Wozniak on a wall of portraits. The wall was a brick wall with a door and large heater. I was speaking to an audience before seeing the photos.

I see Steve Wozniak on a wall of portraits. The wall was a brick wall with a door and large heater. I was speaking to an audience before seeing the photos. I left by driving back to work.

I see Steve Wozniak on a wall of portraits. The wall was a brick wall with a door and large heater. I was speaking to an audience before seeing the photos. I left by driving back to work. It was in 2015 in March on a Wednesday.

Lifelog Search Challenge (@ ICMR)


54

Interactive ad-hoc retrieval challenge in front of an audience.


AAU

A Leibetseder, B Muenzer, A Kletz, M Primus and K Schöffmann. liveXplore at the Lifelog Search Challenge
2018. The Lifelog Search Challenge 2018 at ACM ICMR 2018. (2nd highest performing system)

VIRET

J Lokoc, T Souček and G Kovalcik. Using an Interactive Video Retrieval Tool for LifeLog Data. The Lifelog
Search Challenge 2018 at ACM ICMR 2018. (3rd highest performing system)
UU/DCU

A Duane, C Gurrin & W Hürst. Virtual Reality Lifelog Explorer for the Lifelog Search Challenge at ACM
ICMR 2018. The Lifelog Search Challenge 2018 at ACM ICMR 2018. Top Performing System.
Video Retrieval 65

Many of the systems at LSC have previously participated in the VBS - Video
Browser Showdown competition at the MMM conference, which is currently
in its tenth year.

https://videobrowsershowdown.org/

Video Search has many similarities to Lifelog search and as such, participants
have a significant advantage if they have working video search systems.

On the negative side, these participants often do not customise their retrieval systems enough for lifelog data.

Text Queries 66

Text queries are the standard method of information need input that users
are comfortable with.

All systems integrate text retrieval as a standard query mechanism.

Of course this means that the problem of the semantic gap is in evidence,
necessitating the provision of some form of interactive interface.

Doubtless, better query mechanisms will become available, but for now, text is still dominant.

Get out of the office!


Initial Idea: Wearable assistive technology that assists individuals to easily capture their life experience into a searchable digital memory, and to access that memory to assist in daily life.

[Capture: vision, attention, hearing, speech, biometrics, locations, activities. Access: insights discovery, memory support.]

Wearable devices to capture


everyday-life-experience

Here was our original business model canvas (v1)


Elderly People
What we did and found

Guess: Customer-facing staff need some memory


support to enhance customer experience.

Interviews with: Front-facing staff at Hilton, Starbucks,


etc.

"Nice to have"

Pivot to the next market

A customer archetype that we found

Who: Directors/Owners of nursing homes.

Role: Ensure safe & efficient operations; ensure adherence to regulations.

What matters: Meeting regulation requirements; increasing efficiencies; reducing costs; staff retention; providing a safe environment.

Example persona - Mary Smyth, MSc, BSc, RGN: Director of Care for six nursing homes; 30 years of nursing experience; typically overworked and under daily pressure; must ensure conformance to regulation; high responsibility for staff and residents.
New business model canvas (v2)

We went out and interviewed people

Over 180 hours of interviewing; 100+ face-to-face interviews; 30+ directors; 8 cities; 60+ care staff; 30+ nursing homes; 20+ family members and others; 20+ males, 100+ females.


Whole Ecosystem - nursing home directors, group owners, regulators, government,
distributors, software companies

We thought quality of life can be improved and


accidents can be reduced…
We actually found this…

Regulation demands

2-3 hours daily data

20% staff turnover

Cost pressures

Staff frustration

Bad data (lies)

Poor existing solutions

How LifeCloud can Solve the Pain

Automatic sensing of the Activities of Daily Living in nursing homes.

Adding new hypotheses:
01 22% nursing staff cost reduction
02 Save staff 2 hours per day
03 Leading to increased staff retention
04 Enhanced compliance with European data policies

And invalidating other ones:
01 Gain competitive advantages
02 More information about resident satisfaction and quality of life
03 Reducing the accident rate of residents by 50%

Current version of the BMC (v3)



Pre LifeCloud (annual nursing costs): Nurses €630,000; Care Assistants €800,000; Total €1,430,000. Heavy workload, stressed staff; hard to meet regulatory requirements.

Post LifeCloud (annual nursing costs): Nurses €415,000; Care Assistants €687,000; technologies that automatically gather and enter data €18,000; Total €1,120,000. Reduced workload, happier staff; meets all regulatory requirements.

Based on nursing homes of average size (50 residents). Saving: €310,000.
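A quick check of the quoted saving, using only the figures from the table above:

pre  = {"nurses": 630_000, "care_assistants": 800_000}
post = {"nurses": 415_000, "care_assistants": 687_000, "technology": 18_000}

pre_total, post_total = sum(pre.values()), sum(post.values())   # 1,430,000 and 1,120,000
saving = pre_total - post_total                                  # 310,000
print(f"Annual saving: EUR {saving:,} ({saving / pre_total:.0%} of pre-LifeCloud nursing costs)")

The printed share (22%) matches the cost-reduction hypothesis above.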

Wearable technologies that assist directors of nursing homes to reduce nursing costs by 22% while meeting all regulatory requirements.
Ecosystem Map
Thank you very much! (Tusen takk!)
Special thanks to the DCU team:
Cathal Gurrin
Tu-Khiem Le
Van-Tu Ninh
Multi-Device Analysis around Tabletops

Morten Fjeld
t2i Lab, Interaction Design division
CSE, Chalmers, Sweden

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 1/42
2
CV

Ecole Polytechnique Grenoble, ENSIMAG, France (1989)


NTNU, Norway, Applied Mathematics (1990)
ETH Z, PhD in Human-Computer Interaction (2001)
ETH Medal (2002)
Since 2004 at CSE, Chalmers; since 2018 full professor
Visiting teacher & researcher at:
University of Zurich (2008)
NUS Singapore (2011)
Tohoku University (2016 to 2017)
University of Bergen (2016 to 2018, 2019-)
ETH Zurich (2019)

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 2/42
3/12

t2i Lab, August 1st 2017

1 Prof
1 Postdoc
4 PhD students
2 MSc students
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) 16.11.21 ETH Zurich, 12th November 2018 3/42
Background

Large amounts of data are generated from


everyday activities, such as commuting patterns,
food sales, mobile phone usage, and
even indoor cooling- and heating-preferences.
“Big data” is a catchword used by IT professionals
to describe immense data sets; governments and
industry increasingly make use of large data.
When unstructured data is connected to specific places, it can also be called location-based data:
“data relevant for the context of interaction”.

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 4/42
Background

Smart and novel uses of portable devices may


open up access to location-based data for a wider
range of users, enabling ordinary citizens to
rapidly explore and analyze data in ad hoc,
casual settings … as opposed to the highly
technical statistical methods used today.

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 5/42
Background

This has significant implications for how people


could learn about and participate in civic life and
how data-driven conversations could
contribute to society’s ongoing evolution.

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 6/42
Presentation overview

Observations, motivation, and RQs


Stage 1: Mobile tabletops, bring your own device
Example A: Exploring phone-tablet combination
Stage 2: Beyond personal devices
Example B: Infovis interaction techniques
Example C: Collaborative analysis
Insights for interaction design / HCI

Connecting distributed teams … more projects …

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 7/42
Observation: History of tabletops


Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 8/42
Observation: History of tabletops

Christian Muller-Tomfelde and Morten Fjeld. 2012. Tabletops: Interactive Horizontal Displays
for Ubiquitous Computing. Computer 45, 2 (February 2012), 78-81.
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 9/42
Observation: Many users own
more than one device (Pew, 2012)

© Pew Research Center, 2012


Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 10/42
Motivation:
Data-analysis around tabletops

Ad hoc settings
and uses of
portable
technologies:
office, camp,
café.

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 11/42
Research questions (RQs)

What is tomorrow’s mobile multi-device interface?


How can we visualize substantial/location-based
data at multi-device interfaces?
How can we leverage emerging mobile sensors?
What are the interaction patterns for complex
data presented at multi-device interfaces?
How can such visualization and interaction add
value to sense making activity?

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 12/42
Tomorrow’s multi-device UI?

Tablet–phone assortment (left), potential combinations


(center) and Dynamic Duo pairing (right)

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 13/42
Tablet-phone taxonomy

Mini-exercise:
1. Suggest a mobile application making use of a phone and a tablet device in a novel and innovative way
2. Use the taxonomy to argue for your idea
3. Also use the taxonomy to explore alternative input (I) and output (O) combinations
4. Do you see a relation to eyes-free input?
5. Do you see a relation to button size and "The Generalized Perceived Input Point Model"? Explain

Figure: Tablet–phone combinations coded with hands to denote input (I) and a white screen to denote output (O).

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 14/42
Distributed information display

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 15/42
Distributed control scenarios

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 16/42
Distributed control and position:
circular menu technique

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 17/42
Combining these techniques 1:00

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 18/42
A: Demo of phone-tablet patterns

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 19/42
Stage 2: Beyond personal devices

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 20/42
Mobile information visualization

The three information visualisations studied: Users can


explore i) bar chart, ii) time series plot, and iii) hierarchy
diagram by repositioning the phone relative to the tablet.

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 21/42
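A rough sketch of the kind of mapping behind this technique (an illustration only, not the study software; the geometry and names are assumptions): the phone's position over the tablet selects which part of the visualisation it shows in detail.

def detail_view(phone_x_mm, tablet_width_mm, categories):
    # Map the phone's horizontal position over the tablet to one bar of a bar chart,
    # so sliding the phone sideways scrolls through the categories.
    x = min(max(phone_x_mm, 0), tablet_width_mm - 1)   # clamp to the tablet surface
    index = int(x / tablet_width_mm * len(categories))
    return categories[index]

# Example: a tablet 300 mm wide showing four bars; the phone sits 220 mm from the left edge.
print(detail_view(220, 300, ["Q1", "Q2", "Q3", "Q4"]))   # -> "Q3"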
Mobile information visualization

Getting ready for new technology:
we emulate embedded spatial sensors
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 22/42
Mobile information visualization 0.59


Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 23/42
B: Demo of infovis techniques, 0:44

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 24/42
Spatially aware device tracking

Getting ready for new technology:


we emulate embedded spatial sensors
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 25/42
Collaborative analysis

Crime mystery
• when
• who
• how
• why

Shared corpus
of data
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 26/42
Alternatives for sensemaking, 0:19

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 27/42
C: Demo of collaborative analysis

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 28/42
Some empirical findings

• TCT better than paper
• Fewer notes moved
• No impact on workload

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 29/42
Connecting distributed teams;
telepresence

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 30/42
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) 16.11.21 ETH Zurich, 12th November 2018 31/42
Distributed physical tasks; physical
presence

=> Computer vision


=> Augmentation techniques

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 32/42
Tabletop interaction and issue of reach

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) 16.11.21 ETH Zurich, 12th November 2018 33/42
Tabletop interaction and issue of reach

Microsoft Surface 2.0
Debut at the 2011 Consumer Electronics Show
In cooperation with Samsung (SUR40)
Key features:
IR sensor integrated into the LCD (PixelSense)
40-inch diagonal
1920 x 1080 pixels
Depth of 4 inches
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 34/42
Smart and actuated work environments
2.17

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) 16.11.21 ETH Zurich, 12th November 2018 35/42
Smart and actuated work environments
2.17

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) 16.11.21 ETH Zurich, 12th November 2018 36/42
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 37/42
Situated interaction still needed

Christian Muller-Tomfelde and Morten Fjeld. 2012. Tabletops: Interactive Horizontal Displays
for Ubiquitous Computing. Computer 45, 2 (February 2012), 78-81.
Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 38/42
Future work in situated tabletops:
Modular TableTiles

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 39/42
Future work in situated tabletops:
Self-shaping MoldableDisplays

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 40/42
Insights for interaction design/HCI

Tomorrow’s mobile interface will be multi-device


By proof-of-concept we have demonstrated that
key application areas benefit from multi-device
Compact mobile sensors (e.g. compact ultrasonic spatial sensors) will leverage multi-device UIs
We have examined a few possible data-oriented
interaction patterns for multi-device UIs
We have systematically examined how
collaborative sense making can benefit from
multi-device user interfaces (UIs).

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 41/42
Thanks to research partners

Morten Fjeld, t2i Lab, CSE, Chalmers (www.t2i.se) ETH Zurich, 12th November 2018 42/42
What is Interaction Design?
basics, terms, definitions
concrete examples & real-life insights

Week 1
Lecture Notes

1
Practical Course Information

2
Knowledge you will gain in this course
• knows a definition of interaction design and human-computer
interaction
• knows the concepts of usability, user experience and user-centred
design
• knows the lifecycle model of interaction design
• has knowledge about different kinds of requirements
• knows the key concepts and terms used in evaluation
• has knowledge of different types of evaluation methods

See https://www.uib.no/emne/INFO162 3
Skills you will gain in this course
• can outline and discuss usability goals and user experience goals
for designing an interactive product
• can identify suitable methods for evaluating interactive
technologies
• can identify suitable methods for establishing requirements
• can discuss the conceptual, practical, and ethical issues involved in
evaluation
• can discuss the advantages and disadvantages of low-fidelity and
hi-fidelity prototypes
• can produce simple prototypes of interactive products
See https://www.uib.no/emne/INFO162 4
Course Schedule: Highlights (1/3)
• Introduction: What is Interaction Design?
• User Experience Design
• Interaction & Interfaces Tangible and
Tabletop Interfaces
• Cognitive Aspects – I and II
• Data Gathering & Analysis
• Design & Prototyping
• Evaluating Systems, Interaction, and Users – I and II
• Accessible Interaction Design, Mobile UIs
• Tangible Interaction, VR and AR
• Presentation of Semester Assignment
Touch and
Multitouch Interfaces
5
Course Schedule: Highlights (2/3)
Low-fidelity prototyping
• Introduction: What is Interaction Design?
• User Experience Design
• Interaction & Interfaces
• Cognitive Aspects – I and II
• Data Gathering & Analysis
• Design & Prototyping
• Evaluating Systems, Interaction, and Users – I and II
Information Visualization Design
• Accessible Interaction Design, Mobile UIs High-fidelity prototyping
• Tangible Interaction, VR and AR
• Presentation of Semester Assignment

6
Course Schedule: Highlights (3/3)
Remote Drone Control
• Introduction: What is Interaction Design?
• User Experience Design
• Interaction & Interfaces
• Cognitive Aspects – I and II
• Data Gathering & Analysis
Graphical (GUI) and Tangible (TUI)
• Design & Prototyping
Frameworks
• Evaluating Systems, Interaction, and Users – I and II
• Accessible Interaction Design, Mobile VR,
UIshandhelds, science exploration
• Tangible Interaction, VR and AR
• Presentation of Semester Assignment

7
Course Specifics

• Course web: https://mitt.uib.no/courses/29682


• Lectures every week, Tuesday 14:15 – 16:00
• 4 assignments, done in small groups (3 persons)
• Group assignment, done in small groups (3 persons) (40%
of grade)
• Written exam (60% of grade)

8
About teacher: Floris van den Oever
• Born in The Netherlands
• Bachelor's: Applied Psychology, Fontys University of Applied Sciences, Eindhoven, The Netherlands (2015)
• Master’s: Human Factors and Engineering Psychology with an
Honours Track in Design, University of Twente, Enschede, The
Netherlands (2019)
• Human Factors researcher at the Royal Dutch Aerospace Centre
(2019-2021)
• PhD on Using Augmented Reality to Facilitate Maritime
Collaboration (2021 - 2025)
23.11.21 9
10
11
About teacher: Morten Fjeld, CV
• Born in Bergen, Swiss and Norwegian citizen, live in Bergen/Gothenburg
• Ecole Polytechnique Grenoble, ENSIMAG (1989)
• NTNU, Norway, Applied Mathematics (1990)
• ETH Z, PhD in Human-Computer Interaction (2001)
• ETH Medal (2002)
• Since 2004 at CSE, Chalmers
• Visiting teacher & researcher at Univ. of Zurich (2008),
NUS Singapore (2011), University of Bergen (2016), Tohoku University
(2017), ETH Zurich (2019-2020)
• Full professor at University of Bergen (2020 - )
23.11.21 12
People (1/2)
• Teachers:
• Morten Fjeld [email protected] (Bergen)
Floris van den Oever [email protected] (Bergen)

• Guest teachers and tentative themes:


Duc Tien Dang Nguyen [email protected] (Bergen) (2 hours) (life logging)
• Mehdi Elahi [email protected] (Bergen) (1 hour) (ML, rec.sys.)
• Frode Guriby [email protected] (Bergen) (1 or 2 hours) (project pres.)
• Ilona Heldal [email protected] (Bergen) 1 or 2 hours) (VR/AR/SeriousGames)

• Cathal Gurrin (Tentative) - (life logging, quantified self, personal data)

See the course plan on Mitt.UiB

13
People (2/2)
Course representatives:
– Course participant 1 (UiB student, compulsory course)
– Course participant 2 (UiB student, elective course)
– Course participant 3 (UiB student, from Informatikk)
– Course participant 4 (exchange student)

3 short feedback meetings

You will get experience and a small gift


(e.g. quality chocolate)
14
Reading list
Main book:
– Interaction Design: Beyond Human Computer
Interaction, 5th Edition. Authors: Helen Sharp,
Jennifer Preece, Yvonne Rogers (2019).

Other resources:
– Research Methods on Human Computer
Interaction, 3rd Edition. Authors: Lazar, Feng,
Hochheiser (2017).

– Web links and research papers, including video material, will be made available on the course web
15
Course outline
# Date Topic and lecturer Literature
1 August 24 Introduction: What is Human-Computer Interaction? Oever Ch1.1-1.5, Ch.1.7, Ch2
2 August 31 User Experience Design, Oever Lecture Notes
3 Sept. 7 Interaction & Interfaces, Fjeld Ch3, Ch7
4 Sept. 14 Cognitive Aspects–I, Fjeld/ Oever Ch4, Lecture Notes
5 Sept. 21 Life Logging, Dang Nguyen Lecture Notes
6 Sept. 28 Cognitive Aspects–II, Data Gathering & Analysis, Fjeld/Oever Ch4, Ch8–Ch11
8 Oct. 5 Evaluating Systems, Fjeld Ch14, Ch15, Ch16
7 Oct. 12 Design & Prototyping, Fjeld Ch12, Lecture Notes
9 Oct. 19 Interaction and Users, Fjeld Ch16, Lecture Notes
10 Oct. 26 Machine Learning and Recommender Systems, Elahai Lecture Notes
11 Nov. 2 Tangible Interaction, VR and AR, Oever Lecture Notes
12 Nov. 9 Presentation of Semester Assignment, Oever and Fjeld how to write/present
13 Nov. 16 Presentation of Semester Assignment, Oever and Fjeld how to write/present
16
Seminars
– Registration has opened on Mitt.UiB → Join a seminar
– Groups are led by the teaching assistants group on Mitt.UiB
– 13 seminars
– Semester assignment groups are formed within lab groups
– Chance to receive some guidance on your projects
– Presentation of compulsory assignments
– 80% attendance mandatory
– If you are unable to attend you may be excused
if you have a valid doctor’s note;
please contact [email protected], subject:
– If you have any question about labs in general,
please contact [email protected], subject:

17
Compulsory assignments
– 4 compulsory assignments
– Completed by groups of 3 (same group as the semester assignment)
– Register your group on Mitt.UiB under «People»/«Personar»
– Must be submitted at course web and must be accepted to qualify for exam
– Must be orally presented by your group at your lab group
– May contribute to your semester assignment
– Graded with pass/fail. Don’t count to final grade but must be done to do exam
# Issue Date Short Description Deadline
1 August 23 Form groups, pitch ideas and present similar designs Sept. 6
2 Sept. 6 Identify and explore user needs Sept. 27
3 Sept. 20 Write a rationale for your choices Oct.11
4 Oct. 4 Design a single page presentation of your project Oct. 25 18
Semester assignment, grading
– Interaction design project:
Design a prototype of an interactive product by pursuing a user-centered design
process

– Issued at the course web <date to be defined>


– Solved in groups of 3 students
– Final submission due 26th November
– Weighs 40% on your final grade
– You may present your preliminary work in class 5th or 12th November 2020
Date Assessment Weight
Nov. 24 Project hand-in 40%
Nov. 30 Written exam (3 hours) 60% 19
Questions of interest to more than one student will typically be answered on the course web, under announcements.

20
Introduction to Interaction Design

21
History of interactive (enterprise) technology

https://infostory.com/2013/09/15/timeline-of-enterprise-technology/
23
Colossus (1943-1945) ENIAC (1943)

24
SketchPad (1962)

by Ivan Sutherland

25
Xerox Star (1981)

26
Microsoft Windows 1.0 (1985)

27
28
How do you optimize the users’ interactions with
a system, an environment or a product?

– Taking into account what people are good and bad at

– Considering what might help people with the way they currently do things

– Thinking through what might provide quality user experience

– Listening to what people want and getting them involved in the design

– Evaluating interaction patterns and gathering system requirements


29
Interaction design is (1/3)

“ Designing interactive products to support the way


people communicate and interact in their
everyday and working lives.
” Preece et al. (2015)

30
Interaction design is (2/3)

“ Interaction design defines the structure and


behaviors of interactive products and services,
and user interactions with those products and
services.
” Interaction Design Association: IxDA.org

31
Interaction design is (3/3)

32
Human computer interaction is (1/3)
“a discipline concerned with the design, evaluation and implementation of
interactive systems for human use and with study of major phenomena
surrounding them.”

(ACM SIGCHI, 1992, p. 6)

33
HCI is about... (2/3)
– Understanding users
– Understanding user’s tasks
– Understanding the surrounding environment
– GUI requirements gathering and analysis
– Design prototype
– Evaluate system

34
HCI is... (3/3)

35
User experience (UX) (1/6)
The user experience is the level of satisfaction that
your system provides to every visitor.

It’s a “person’s perceptions and responses resulting


from the use and/or anticipated use of a product,
system or service.” (ISO 9241-210)

36
UX (2/6)

How people feel about a product and their pleasure and satisfaction when using
it, looking at it (…) [including] their overall impression of how good it is to use (…)

Preece et al. (2015)

37
UX (3/6)
Evaluate how users feel about a system, looking at;
– ease of use,
– perception of the value of the system,
– utility,
– efficiency in performing tasks and so forth,

The system could be a website, a web application or desktop software.


– How do users feel?
– Is it pleasant to use?
– Does this product give them value?
– Is it easy to use?
by Dr. Donald Norman

38
UX example (4/6)

39
UX example (5/6)

40
User experience honeycomb (6/6)
[Honeycomb diagram of UX qualities, including: fill a need, simple, attractive, ease of use, aesthetics, navigation, inclusive, trustworthy.]
41
Accessibility
– refers to the degree to which an interactive product is accessible by as many
people as possible.

– a focus is on people with disabilities, but also on low-age (children) and high-age

People are considered to be disabled if;


– They have a mental or physical impairment
– The impairment has an adverse effect on their ability to carry out normal day-to-day activities
– The adverse effect is substantial and long term (meaning it has lasted for 12 months, or is likely
to last for more than 12 months or for the rest of their life)

42
From Usability goals to User experience goals

43
Usability (1/8)
“the extent to which a product can be used by specified users to achieve specified
goals with effectiveness, efficiency and satisfaction in a specified context of use.”

(ISO 9241-151)

44
Usability (2/8)
According to Jakob Nielsen, usability is defined by 5 quality components:

1. Learnability
2. Efficiency
3. Memorability
4. Errors
5. Satisfaction

Jakob Nielsen

45
Usability (3/8) - why users leave?
If;
– they are forced to register,
– they face bad navigation,
– they face a long check-out process,
– they face long form filling,
– there is a lack of information in the interface,
– there are long loading times,
– they face suddenly appearing pop-up messages or pop-up videos,

46
Usability – keep in mind (4/8)
– Use usability guidelines but do not rely only on them; always test with users.

– Ensure that the sequences of actions to achieve a task are as simple as possible.

47
Usability – keep in mind (5/8)
– Ensure that the user can always get out, go back or undo an action.

– Ensure that response time is adequate.

48
Usability – keep in mind (6/8)
– Ensure that the UI’s appearance is uncluttered.

– Consider the needs of different groups of users.


– Provide all necessary help
49
Usability goals (7/8)

Usability is broken down into the following goals;

1. Effective to use (effectiveness)


2. Efficient to use (efficiency)
3. Safe to use (safety)
4. Having good utility (utility)
5. Easy to learn (learnability)
6. Easy to remember how to use (memorability)
50
User experience goals (8/8)
Desirable and undesirable aspects of the user experience:

Desirable aspects;
Satisfying, helpful, fun, enjoyable, motivating, provocative, engaging, challenging,
surprising, pleasurable, enhancing sociability, rewarding, exciting, supporting
creativity, emotionally fulfilling, entertaining, cognitively stimulating

Undesirable aspects;
Boring, unpleasant, frustrating, patronizing, making one feel guilty, making one feel
stupid, annoying, cutesy, childish, gimmicky

51
How do we create usable designs?

52
Norman’s design principles

– Visibility
– Feedback
– Constraints
– Consistency
– Affordance

53
– Visibility
– Feedback

Visibility – Constraints
– Consistency
– Affordance

– Displaying possible actions the user can make within


the interface
– Visible functions are more likely to be used
– Actions that are less visible or hidden are less likely to
be used

54
– Visibility
– Feedback

Feedback – Constraints
– Consistency
– Affordance

– Communicates the results of an interaction

– Communicated to the user by the system through


– Auditory cues
– Visual cues
– Tactile cues (e.g. vibration)
– Verbal cues

55
– Visibility
– Feedback

Constraints – Constraints
– Consistency
– Affordance

– The design concept of constraining refers to determining ways of restricting the


kinds of user interaction that can take place at a given moment.

56
– Visibility
– Feedback

Consistency – Constraints
– Consistency
– Affordance

– Following a design “language” increases consistency


between different systems

– Benefits the user:


– Increases learnability
– Eliminates confusion

57
– Visibility
– Feedback

Affordances – Constraints
– Consistency
– Affordance

– To afford means ’to give a clue’

– Affordances are all the possible actions


we can do with some object

I’m a Button, believe you me

Don Norman on affordances, 1994


58
– Visibility
– Feedback

Affordances – Constraints
– Consistency
– Affordance

– Have suggestions or clues about how to use these properties.

– Can be dependent on the


– Experience
– Knowledge
– Culture of the actor
– Can make an action easy or difficult

59
– Visibility
– Feedback

Types of affordances – Constraints


– Consistency
– Affordance

– Explicit (perceived)
– False
– Hidden

60
– Visibility
– Feedback

False affordances – Constraints


– Consistency
– Affordance

– occur when something appears to afford an action but there is no function behind it.

61
Alternative approach: Metaphors in Interaction Design, Jim Alty

Alty, James L., Roger P. Knott, Ben Anderson, and Michael Smyth. "A framework for engineering metaphor at the user interface.”
Interacting with computers 13, no. 2 (2000): 301-322. https://tinyurl.com/sodp2jv. 62
User centered design

64
User-centered design

– an approach to design that focuses on users in each


phase of the design process

– a lifecycle process that puts an early emphasis on


user and task analysis and user involvement in the
design and testing of a product

– also called as usability engineering


65
User-centered design: It is iterative

66
User-centered design activities

67
Concrete examples & real-life
insights
68
69
Issues of reach within collaboration around large displays
Ortholumen, t2i Lab, Chalmers, 2007, Tommaso Piazza

71
The role of user interfaces in our lives

72
The role of user interfaces in our lives

73
The role of user interfaces in our lives

"Microsoft abdicates: Google can take over the throne" (Aftenposten, 4th Oct. 2019);


https://www.aftenposten.no/kultur/nyhetsanalyse/i/dOzmBz/microsoft-abdiserer-google-kan-overta-tronen 74
The role of user interfaces in our lives (2016); Alex Danc

75
The role of user interfaces in our lives

76
The role of user interfaces in our lives

77
Apple, TagesAnzeiger (a newspaper, 2018)

Tagesanzeiger, Friday 18 January 2019, 10:42

What comes after the smartphone boom?
Demand for mobile phones has passed its peak. This poses major problems for the manufacturers; a look at Samsung, Apple and Huawei.

https://m.tagesanzeiger.ch/articles/11016672 78
DynamicDuo, 2008:
Making tablets and phones interact
DynamicDuo, t2i Lab, Chalmers, 2011, Tommaso Piazza

80
Application sharing, e.g. DeepShot (2011):

Chang, T.-H. and Li, Y. (2011) Deep shot: a framework for migrating tasks across devices using mobile phone cameras. In
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, USA, pp. 2163–2172.
81
Application sharing, e.g. CapCam (2016):

Xiao, R., Hudson, S. and Harrison, C. (2016) Capcam: enabling rapid, ad-hoc, position-tracked interactions between
82
devices.
In Proceedings of the International Conference on Interactive Surfaces and Spaces, ACM, New York, NY, USA, pp. 169–178.
Takuma Hagiwara, Kazuki Takashima, Morten Fjeld, Yoshifumi Kitamura, CamCutter: Impromptu Vision-Based Cross-Device
Application Sharing, Interacting with Computers, , iwz035, https://doi.org/10.1093/iwcomp/iwz035
83
PDF: https://www.dropbox.com/s/xfq99q84xp6ao9s/CamCutter_IWC_2019.pdf?dl=0.
Textbook: Activity 1.2, p. 14

Handoff by Apple works between spatially close iOS devices and OSX devices signed into the same iCloud account. Each device must have Bluetooth turned on and be connected to the same Wi-Fi network. A user opens and starts a task with a compatible app, like Mail or Pages, and can then switch to a nearby device. On the nearby device, which may be locked, the user logs in and picks up where s/he had left off. … ease-of-use?

Apple Handoff: https://support.apple.com/guide/mac-help/hand-off-between-devices-mchl732d3c0a/mac
84
CamCutter: Impromptu Vision-Based
Cross-Device Application Sharing

Takuma Hagiwara, Kazuki Takashima, Morten Fjeld, Yoshifumi Kitamura, CamCutter: Impromptu Vision-Based Cross-Device
Application Sharing, Interacting with Computers, , iwz035, https://doi.org/10.1093/iwcomp/iwz035 85
PDF: https://www.dropbox.com/s/xfq99q84xp6ao9s/CamCutter_IWC_2019.pdf?dl=0.
CamCutter: Impromptu Vision-Based
Cross-Device Application Sharing

Takuma Hagiwara, Kazuki Takashima, Morten Fjeld, Yoshifumi Kitamura, CamCutter: Impromptu Vision-Based Cross-Device
Application Sharing, Interacting with Computers, , iwz035, https://doi.org/10.1093/iwcomp/iwz035 86
PDF: https://www.dropbox.com/s/xfq99q84xp6ao9s/CamCutter_IWC_2019.pdf?dl=0.
MiniAssignment

• Go to a breakout room with your assignment group


• Discuss future and meaningful uses of pairing of two mobile devices
• Discuss future and meaningful uses of pairing of mobile and fixed devices

• Come back in 5 minutes!

87
Different kinds of knowledge and skills

William Blake (1757-1827)


https://poets.org/poem/proverbs-hell 89
User Experience
INFO262: Interaction Design
2021-08-31
Reading material

• (Two pages on) Technology as Experience by


McCarthy and Wright (2004).
• The full book was used in making this PPT.
• User experience (UX): towards an experiential perspective on
product quality. In Proceedings of the 20th Conference on
l'Interaction Homme-Machine(IHM '08) by Hassenzahl (2008).
• Highly relevant for further studies and work.
Outline

• Phenomenology of experience
• User experience, a history and a definition by Hassenzahl
• A brief history of HCI and ‘the user’
• User experience design
A Brief History of Personal Computers
6
The Experience of Living with Technology

“The old computing was about what computers could do; the new computing is about what users can do. Successful technologies are those that are in harmony with users’ needs. They must support relationships and activities that enrich the users’ experiences.” (Shneiderman, 2002, p. 2)

BTW, this is how you cite a quote according to APA.
Experience

• Experience as meaningful, personally encountered events

• “Opplevelse”: the sensory, felt experience of events


• “Erfaring”: knowledge gained through events
Technology as Experience

“In order to do justice to the wide range of influences that


technology has in our lives, we should try to interpret the
relationship between people and technology in terms of the felt life
and the felt or emotional quality of action and interaction.”
(McCarthy and Wright, 2004, p27)
Technology as Experience

“Experience is difficult to define because it is reflexive and as ever-present as swimming in water is to a fish.” (McCarthy and Wright, 2004, p15)
What is User Experience?
Preece et al. (2015)
International Standards Organization

• ISO 9241: Ergonomics of human-system interaction —Part 210: Human-


centered design for interactive systems

• UX is "[a] person's perceptions and responses resulting from the use


and/or anticipated use of a product, system or service"

• Note to entry: User experience includes all the users' emotions, beliefs,
preferences, perceptions, physical and psychological responses, behaviours
and accomplishments that occur before, during and after use.
Hassenzahl’s UX definition (1/2)

“[UX is] a momentary, primarily evaluative feeling (good-bad)


while interacting with a product or service.”
Hassenzahl contd.

Pragmatic quality
• refers to the product's perceived ability to support the
achievement of "do-goals", such as "making a telephone call",
"finding a book in an online-bookstore", "setting-up awebpage."
• Pragmatic quality calls for a focus on the product – its utility and
usability in relation to potential tasks.

• Pragmatism: practical effects of conceptions and objects


Hassenzahl contd.

Hedonic quality
• refers to the product's perceived ability to support the
achievement of "be-goals", such as "being competent", "being
related to others", "being special”

• Hedonism: "the pursuit of pleasure and intrinsic goods are the


primary or most important goals of human life.” (Wikipedia,
ND)
Smartphones can do lots of
things. They can also be used
to achieve intrinsic goals.
This smartphone is very
expensive. How does this
luxurious property of the object
affect the user experience?
Charlie,
a total hedonist
What makes
him a hedonist?
This news story shows
some kids escaping their
class’ camping trip to find
reliable internet connection
so they can maintain their
Snapchat “streaks”
Although collecting Snapchat streaks
may seem ridiculous and even harmful
to adults, for these kids it represents a
very real and important part of their
everyday experiences.
Maslow’s Hierarchy of Needs (1943)
Hassenzahl’s UX definition (2/2)

"Good UX is the consequence of fulfilling the human needs for


autonomy, competency, stimulation (self-oriented), relatedness,
and popularity (others-oriented) through interacting with the
product or service (i.e., hedonic quality). Pragmatic quality
facilitates the potential fulfilment of be-goals.”

Hedonic

Pragmatic
Hassenzahl’s definitions united

“[UX is] a momentary, primarily evaluative feeling (good-bad)


while interacting with a product or service.”

"Good UX is the consequence of fulfilling the human needs for


autonomy, competency, stimulation (self-oriented), relatedness,
and popularity (others-oriented) through interacting with the
product or service (i.e., hedonic quality). Pragmatic quality (do-
goals) facilitates the potential fulfilment of be-goals.”
A brief history of ‘the user’
70s-80s: Cognitivist models

• 'The user’ assumed as a person


sitting in front of a computer
screen and keyboard
performing mundane, well-
described tasks.

• Generalized interaction and


reduced complexity of people
to provide a simplistic model
for developing usable systems.
Card, Moran & Newell (1983)
80-90s: The User as a Social Actor

• Everyday activity included in


understanding sensemaking of
human-computer interaction.

• Views all action as “richly


contextualized” – hence an
interest for the human
experience of using computing
technology
90-00s: Consumers and ‘UX’

• New elements of human life


are included in the human-
computer interaction such as
culture, emotion and
experience.

• Applications characterized as
“non-work, non-purposeful,
non-rational” (Bødker, 2006)
The age of “UX”

• User experience has been


operationalized as a ”craft” or
”skill” for interface designers
and developers.

• “UX” gives more search results


than “interaksjonsdesign” on
Finn.no
User Experience Design
Can you design a user experience?

No, because UX is, besides the designed product, also


about the user and the context. You can design neither.
You could design for a user experience

You can aim to fulfill the goals of users in certain


contexts
Start with a “why?” (the “be”-goals)

“…hedonic quality refers to the product's perceived


ability to support the achievement of "be-goals", such as
"being competent", "being related to others", "being
special"” (Hassenzahl, 2008, p2)

Keep in mind the relevant context


Then, consider what your design
should accomplish (the “do”-goals)
“Pragmatic quality refers to the product's perceived
ability to support the achievement of "do-goals", such as
"making a telephone call", "finding a book in an online-
bookstore", "setting-up a webpage." Pragmatic quality
calls for a focus on the product – its utility and usability
in relation to potential tasks” (Hassenzahl, 2008, p2)

Keep in mind the relevant context


Finally, you may consider technology,
concrete design and implementation.
“On the lowest level of the hierarchy are motor-goals”.
(Hassenzahl, 2010, p12). For example, click the search
button for a book on an online bookstore or dialing a
number for a telephone call
UX: Why? What? How? (Hassenzahl, 2010, p12)
Exercise: Critically evaluate a product

Choose a product humans can interact with and discuss:


1. What are (some of) its be-goals?
2. What are (some of) its do-goals?
3. What are (some of) its motor-goals?
4. How could it be improved in achieving these goals?
When designing, you’ll make lots of mistakes
“It’s really hard to design products by focus groups. A lot
of times, people don’t know what they want until you
show it to them.” (Jobs, N.D.)

“A common mistake that people make when trying to


design something completely foolproof is to
underestimate the ingenuity of complete fools.”
(Adams, 2009, p113)
Design case: Grinding coffee beans
Balancing the utility of automation
with the experience of manual labor
This device combines the utility of
automatic processes with the
embodied experience of manual labor.
Get it?

This is not really about


pragmatic qualities!

• But about combining different


utilities of objects and
processes to create a
richer experience.
Main takeaways

• Computing is ubiquitous - it is embedded in our surroundings and


woven into our everyday lives
• Context of use plays an important role in the user experience (and
usability)
• User experience is an “evaluative feeling” – of pragmatic and hedonic
qualities (“do”-goals and “be”-goals)
• When designing for an experience, one can start the process by asking
“why?” to situate one’s ideas around a supposed experience
• Dare to see beyond pragmatic qualities
Cognitive Aspects - II

Week 6

Lecture Notes

1
Outline
– What is cognition?
– Describe how cognition has been applied to interaction design
– Attention
– Perception
– Memory
– Reading, speaking and listening
– Problem-solving, planning, reasoning and decision-making
– Mental models
– Data collection and privacy

2
INFO262 - Interaction Design, Spring 2019, University of Bergen
Cognition

3
What is cognition?
– Different kinds of cognition
thinking, remembering, learning, daydreaming, decision making, seeing, reading, writing, etc

– Norman distinguishes between two general modes:


(1) experiential [fast thinking] cognition
a state of mind in which we perceive, act, and react to events around us intuitively and
effortlessly.

(2) reflective [slow thinking] cognition


involves mental effort, attention, judgment, and conscious decision making.

4
Cognitive processes
– Attention
– Perception
– Memory and learning
– Reading, speaking and listening
– Problem-solving, planning, reasoning and decision-making
– Mental models

5
Attention

6
Attention
– selecting things to concentrate on, at a point in time, from the range of
possibilities available.

– involves our auditory and/or visual senses.

– has very limited capacity.

7
Attention
Task:
– Find a specific address within a city

They can pay attention to one thing and filter out


all other stimuli.
– This is called selective attention.

People are easily distracted in many situations.


– Their attention can often be pulled away from
what they’re focusing on.

8
Multitasking and attention
– Can we attend to two different things or events at the same time?

– Ophir et al (2009) compared heavy vs light multi-taskers


– heavy multi-taskers were more prone to being distracted than those who
infrequently multitask
– heavy multi-taskers were easily distracted and found it difficult to filter
irrelevant information

9
Design implications for attention
– Make information salient when it needs attending to

– Use techniques that make things stand out, like color, spacing, underlining,
sequencing and animation

– Avoid cluttering the interface with too much information

– Form fill-ins that have simple and clean interfaces are easier to use

10
Data collection related to attention
– Eye tracking
– Think aloud protocols
– Questionnaires
– Performance measures (reaction time, task completion time, web analytics)
[Illustration: think-aloud speech bubbles, e.g. “I look in the top right corner for a menu… Now I click the flashing button…” / “Still hasn’t seen the notification…”]

11
Perception

12
Perception
– The process of organizing and interpreting information, enabling us to recognize
meaningful objects and events.

– Perception refers to how information is acquired


from the environment via the different sense
organs and transformed into experiences of
objects, events, sounds, and tastes.

13
Perception
– How information is acquired from the world and transformed into experiences

– Design representations that are readily perceivable, e.g.


– Text should be legible

– Icons should be easy to distinguish and read

– Use white spaces

14
Gestalt principles
A set of laws, describing how humans typically see objects by grouping
similar elements, recognizing patterns and simplifying complex images.

These theories became known as the Gestalt principles

15
Gestalt principles - proximity
The relative distance between objects in a display affects our perception of
whether and how the objects are organized into subgroups.

Items that are near each other tend to be related.

16
Gestalt principles - proximity
Designers often separate groups of on-screen control- and data-displays

• by enclosing them in group boxes or

• by placing separator lines between groups

• by spacing

17
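To make the grouping idea above concrete, here is a minimal sketch (a hypothetical illustration using Python’s standard tkinter toolkit, not part of the lecture material): related form fields are enclosed in labelled frames and spaced apart, so proximity and enclosure signal which controls belong together.

import tkinter as tk
from tkinter import ttk

# Proximity and enclosure: related fields sit inside the same labelled frame,
# and the two frames are separated by vertical spacing.
root = tk.Tk()
root.title("Proximity grouping example")

shipping = ttk.LabelFrame(root, text="Shipping address", padding=10)
shipping.pack(fill="x", padx=10, pady=(10, 5))
ttk.Label(shipping, text="Street").grid(row=0, column=0, sticky="w")
ttk.Entry(shipping).grid(row=0, column=1, padx=5)
ttk.Label(shipping, text="City").grid(row=1, column=0, sticky="w")
ttk.Entry(shipping).grid(row=1, column=1, padx=5)

payment = ttk.LabelFrame(root, text="Payment", padding=10)
payment.pack(fill="x", padx=10, pady=(5, 10))
ttk.Label(payment, text="Card number").grid(row=0, column=0, sticky="w")
ttk.Entry(payment).grid(row=0, column=1, padx=5)

root.mainloop()

Removing the frames and the spacing would leave one undifferentiated stack of fields, which is exactly the ambiguity the proximity principle warns against.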
18
19
A/B testing
[Figure: “Original” vs. “Alternative” page layouts; the alternative increased conversion by 10%]
20
Gestalt principles - similarity
Objects that look similar appear grouped, all other things being equal.

Similarity based on shape, color, and size.

21
22
23
24
Gestalt principles - continuity
Elements that are arranged on a line or curve are perceived to be more
related than elements not on the line or curve.

25
26
27
Gestalt principles - closure
Individuals perceive objects such as shapes, letters, pictures, etc., as
being whole when they are not complete.

Specifically, when parts of a whole picture are missing, our perception


fills in the visual gap.

28
29
Gestalt principles - figure ground
People instinctively perceive objects as either being in the foreground
or the background.

30
31
32
Fitts’ law
“Fitts’ Law (1954) describes the relationship between movement time, distance,
and accuracy for people engaged in rapid aimed movements.” (Soukoreff &
MacKenzie, 2004)

“The amount of time required for a person to move a pointer (e.g., mouse cursor) to a
target area is a function of the distance to the target (D) divided by the size of the
target (W). Thus, the longer the distance and the smaller the target’s size, the longer
it takes.” (Interaction Design Foundation, ND)

33
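As a rough, hypothetical illustration of how the Fitts’ law relationship can be applied, the sketch below estimates movement time with the common Shannon formulation MT = a + b · log2(D/W + 1); the constants a and b are placeholders that would normally be obtained by fitting measured pointing data for a given device and user group.

import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Estimate pointing time in seconds with Fitts' law.

    Shannon formulation: ID = log2(D/W + 1), MT = a + b * ID.
    The default a and b values are illustrative placeholders only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A far, small target takes longer to hit than a near, large one:
print(fitts_movement_time(distance=800, width=20))   # ~0.90 s
print(fitts_movement_time(distance=200, width=100))  # ~0.34 s

This is why large, nearby targets (for example, full-width buttons or targets at screen edges) are faster to acquire than small, distant ones.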
Design implications for perception
– Icons should enable users to readily distinguish their meaning

– Bordering and spacing are effective visual ways of grouping information

– Sounds should be audible and distinguishable

– Speech output should enable users to distinguish between the sets of spoken words

– Text should be legible and distinguishable from the background

– Tactile feedback should allow users to recognize and distinguish different meanings

– Consider accessibility for users with sensory impairments


34
Data collection related to perception
– Eye tracking
– Think aloud protocols
– Questionnaires
– Performance measures (reaction time, task completion time, web analytics)
– Interviews & focus groups
[Illustration: think-aloud speech bubbles, e.g. “I look in the top right corner for a menu… Now I click the flashing button…” / “Still hasn’t seen the notification…”]

35
Human Memory and Learning

36
Human memory
– Involves first encoding and then retrieving knowledge.

– We don’t remember everything


– Memory encoding involves filtering and processing what is attended to
– Memory retrieving is fallible

– Context is important in affecting our memory (i.e. where, when)

37
Processing in memory
– Encoding is the first stage of memory
– determines which information is attended to in the environment

– The more attention paid to something…

– The more it is processed in terms of thinking about it and comparing it with other
knowledge…

– The more likely it is to be remembered


– e.g. when learning about HCI, it is much better to reflect upon it, carry out exercises,
have discussions with others about it, and write notes than just passively read a book,
listen to a lecture or watch a video about it

38
Memory
– We recognize things much better than we can recall them

– We remember less about objects we have photographed than when we observe


them with the naked eye

39
Recognition versus recall
– Command-based interfaces require users to recall from memory a command
from a possible set of 100s

– GUIs provide visually-based options that users need only browse through until
they recognize one

– Web browsers, etc., provide lists of visited URLs, song titles etc., that support
recognition memory

40
41
Learning
– Can be intentional and incidental

– It’s easier to learn by interacting with your design than from a manual

– It’s hard to design for intentional learning

42
Design implications for memory and learning
– Don’t overload users’ memories with complicated procedures

– Design interfaces that promote recognition rather than recall

– Provide users with various ways of encoding information to help them remember

– e.g. categories, colour, flagging, time stamping

– Give users the room to learn by inviting exploration of your design, with undo
buttons, tutorials, easy saving, examples, constraining users to appropriate actions

43
Data collection related to memory and learning
– Think aloud protocols
[Illustration: speech bubbles, e.g. “Blablabla” / “Should’ve clicked «Tools»…”]
– Observation
– Performance measures (reaction time, task
completion time, number of errors, web analytics)
– Questionnaires
– Interviews & focus groups

44
Reading, speaking and listening

45
Design implications for reading, speaking and listening
– Use the appropriate medium

– Keep it short

– Provide opportunities to manipulate size, volume, speed etc.

– Consider communicating in multiple channels concurrently (e.g.


visuals or infographics besides the text)

46
Data collection related to reading, speaking and listening
– Performance measures (reaction time, task
completion time, number of errors, web analytics)
– Think aloud protocols
– Observation
– Questionnaires
– Interviews
[Illustration: think-aloud speech bubbles, e.g. “I scroll the menu looking for a copying tool … I click «Edit»” / “Should’ve clicked «Tools»…”]

47
Problem-solving, planning, reasoning
and decision-making
48
Problem-solving, planning, reasoning and decision-making
– These are processes of “higher cognition” that require reflection and
deliberation.

– It’s easy to overload your user with information

– People typically work with heuristics

49
Design implications for problem-solving, planning, reasoning
and decision-making
– Make key info highly salient

– Provide information and help pages

– Make things simple and memorable

– Constrain the amount of information given to what’s useful

50
Data collection related to problem-solving, planning, reasoning
and decision-making
– Performance measures (reaction time, task
completion time, number of errors, web analytics)
– Think aloud protocols
[Illustration: speech bubble, e.g. “This seems to be a hard problem”]
– Observation
– Questionnaires
– Interviews & focus groups

51
Mental models

52
Mental models
– Is someone’s “internal representation of the relations between a set
of elements” (APA Dictionary of Psychology, ND)

– In design “a mental model of a system or product would include its


various attributes, rules for operation and handling, and expectations
regarding use and consequences and would be used to guide the
individual’s interactions with the system or product in question.” (APA
Dictionary of Psychology, ND)

Jakob Nielsen: “What the user believes about the system at hand.”

53
Design implications for mental models
– Give clear and easy-to-follow instructions

– Provide (online) help and tutorials

– Help people understand why and how things happened (especially when things go wrong)

– Manage affordances

– Follow conventions

– Don’t make too many user interaction changes at once

– If there’s a mismatch between design and mental model, change the design. If that’s not possible, help the
user build the right mental model
54
Data collection related to mental models
– Card sorting
– Participatory wireframing and prototyping
– Competitor research

55
Data collection and privacy

56
Data collection and Privacy

57
Key aspects of data collection and privacy
– Follow GDPR

– Collect no more data than you need

– Inform participants of what data you want to collect, how you want to collect it, and what you’ll do with it

– Let participants read and sign an informed consent form

– Anonymize what can be anonymized

– Save data somewhere where only designated people can get it

– Save data for no longer than you need it

58
UNIVERSITY OF BERGEN

INFO162 – Introduction to
Human-Computer Interaction
Summary of the course

Floris van den Oever


UNIVERSITY OF BERGEN

Agenda
• Guest lecture “What’s a start-up?”
• Break
• Course overview
• Lectures highlights
• Studying tips
Course overview
UNIVERSITY OF BERGEN

# Date Topic and lecturer


1 August 24 Introduction: What is Human-Computer Interaction? Oever
2 August 31 User Experience Design, Oever
3 Sept. 7 Interaction & Interfaces with a research example, Fjeld
4 Sept. 14 Data Gathering & Analysis, Fjeld
5 Sept. 21 Empirical Methods, Fjeld
6 Sept. 28 Cognitive Aspects, Oever
7 Oct. 5 Design & Prototyping with a research example, Fjeld
8 Oct. 12 Evaluation (Fitts’ Law, Hick’s Law) & research example, Fjeld
9 Oct. 19 Machine Learning and Recommender Systems, Elahi
10 Oct. 26 Virtual Reality (and some Augmented Reality), Heldal
11 Nov. 2 Life Logging, Dang Nguyen
Knowledge you will gain in this course
• knows a definition of interaction design and human-computer
interaction
• knows the concepts of usability, user experience and user-
centred design
• knows the lifecycle model of interaction design
• has knowledge about different kinds of requirements
• knows the key concepts and terms used in evaluation
• has knowledge of different types of evaluation methods

See https://www.uib.no/emne/INFO162
5
Skills you will gain in this course
• can outline and discuss usability goals and user experience
goals for designing an interactive product
• can identify suitable methods for evaluating interactive
technologies
• can identify suitable methods for establishing requirements
• can discuss the conceptual, practical, and ethical issues
involved in evaluation
• can discuss the advantages and disadvantages of low-fidelity
and hi-fidelity prototypes
• can produce simple prototypes of interactive products
6 See https://www.uib.no/emne/INFO162
1: What’s HCI?
History of interactive (enterprise)
technology

https://infostory.com/2013/09/15/timeline-of-enterprise-technology/
How do you optimize the users’ interactions with
a system, an environment or a product?

– Taking into account what people are good and bad at

– Considering what might help people with the way they currently do things

– Thinking through what might provide quality user experience

– Listening to what people want and getting them involved in the design

– Evaluating interaction patterns and gathering system requirements


9
HCI is about... (2/3)
– Understanding users
– Understanding users’ tasks
– Understanding the surrounding environment
– GUI requirements gathering and analysis
– Design prototype
– Evaluate system

10
User-centered design: It is iterative

focuses on
users in each
phase of the
design process

11
2: UX design
UNIVERSITY OF BERGEN

UX & Usability
According to the International Organization for
Standardization:

UX is “a person’s perceptions and responses resulting from


the use and/or anticipated use of a product, system or service.”

Usability is “the extent to which a product can be used by


specified users to achieve specified goals with effectiveness,
efficiency and satisfaction in a specified context of use.”
UNIVERSITY OF BERGEN

Usability

1. Learnability
2. Efficiency
3. Memorability
4. Errors
5. Satisfaction
UNIVERSITY OF BERGEN

Design principles

– Visibility
– Feedback
– Constraints
– Consistency
– Affordance
• All possible actions with an
object
Hassenzahl’s definitions united

“[UX is] a momentary, primarily evaluative feeling (good-bad)


while interacting with a product or service.”

"Good UX is the consequence of fulfilling the human needs for


autonomy, competency, stimulation (self-oriented), relatedness,
and popularity (others-oriented) through interacting with the
product or service (i.e., hedonic quality). Pragmatic quality (do-
goals) facilitates the potential fulfilment of be-goals.”
History of the «user»
• 70s-80s: Cognitivist models
• 80-90s: The User as a Social
Actor
• 90-00s: Consumers and ‘UX’
• 00-??: The age of “UX”

Card, Moran & Newell (1983)


Main takeaways
• Computing is ubiquitous - it is embedded in our surroundings and
woven into our everyday lives
• Context of use plays an important role in the user experience (and
usability)
• User experience is an “evaluative feeling” – of pragmatic and hedonic
qualities (“do”-goals and “be”-goals)
• When designing for an experience, one can start the process by asking
“why?” to situate one’s ideas around a supposed experience
• Dare to see beyond pragmatic qualities
3: Interaction &
interfaces
UNIVERSITY OF BERGEN

Design goals -> problem space -> design space


1. What you (and users) want the product to be.
2. What you want (or can) create; assumptions about users;
what you want the product to achieve
3. What kind of interface, behaviour, functionality to provide

AKA design-space
21
4: Data gathering and
analysis
What is a requirement?
– A statement about an intended product that specifies what it should do or how to
do it

– It must be specific, unambiguous and clear

– E.g., “a specific button must enable printing of the contents of the current screen”

– It helps us move from problem space to design space

Functional: What a system must do


Non-functional: How well the system performs
23
UNIVERSITY OF BERGEN

Data gathering and analysis


• Interviews • Task description
• Focus groups • User description (persona)
• Questionnaires • Use case
• Observation – User stories
• Web analytics
5: Empirical methods
Goals of Evaluation
Aims to evaluate:
i) one new system vs. one existing system (benchmark)
OR
ii) new alternative a vs. new alternative b

1. Assessing system functionality


2. Assessing effect of interface on use
3. Identifying specific problems

Source: https://designbuzz.com/another-wmd-for-the-high-schooler-queen-bee-s-kitty/
Types of Evaluation

• A few desktop methods (“no lab”)


– Cognitive walkthrough
– Usability Heuristics
– Review-based evaluation
– …
• Laboratory studies (“lab required”)
– Think aloud
– Collaborative studies
– Interviews and questionnaires
– Physiological methods
– …
• Field studies (“no desktop, no lab”)
– …
Jakob Nielsen’s Ten Usability Heuristics

1. Visibility of system status


2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose,
and recover from errors
10. Help and documentation
Video intro, a bit longer than here: https://www.youtube.com/watch?v=_RxfU6dPZuU
UNIVERSITY OF BERGEN

Selection of Method for TUIs


• NASA-TLX
• Eye tracking
• Physiological measurement
• Pleasure-Arousal-Dominance (PAD)
• Hedonic quality; AttrakDiff
• Example 1: UbiSwarm: PAD and AttrakDiff
6: Cognitive aspects:
foundational
knowledge
What is cognition?
– Different kinds of cognition
thinking, remembering, learning, daydreaming, decision making, seeing, reading,
writing, etc

– Norman distinguishes between two general modes:


(1) experiential [fast thinking] cognition
a state of mind in which we perceive, act, and react to events around us intuitively and
effortlessly.

(2) reflective [slow thinking] cognition


involves mental effort, attention, judgment, and conscious decision making.
Cognitive processes
• Attention
• Perception
• Gestalt principles

• Memory and learning


• Reading, speaking and listening
• Problem-solving, planning, reasoning and decision-making
• Mental models: “What the user believes about the system at hand.”

For each category, different design implications and data collection methods are
applicable.
7: Design & prototyping
UNIVERSITY OF BERGEN

But why?
• You can test out ideas for yourself
• It encourages reflection
• Stakeholders can see, hold, interact with a prototype more
easily than a document or a drawing
• To better understand how users will interact with your final
artifact.
• The prototype can reveal errors and omissions in the
requirements.
• Users gain a sense of ownership of the final product.
UNIVERSITY OF BERGEN

Do’s and Don’ts


Do:
• Apply heuristics, cognitive foundational knowledge
• Many design iterations
• Listen to users
• Keep track of what you did
• When creating interactive high-fidelity prototypes and simulations, build in realistic delays

Don’t:
• Prototype features or functionality that cannot be implemented
• Prototype review sessions without clear guidelines for feedback
• Be perfectionistic
• Prototype everything. Most of the time, you shouldn’t have to.
Low-fidelity and high-fidelity prototyping
Low-fidelity:
• Storyboards
• Sketching
• Card-based (paper) prototypes
• Whiteboards
• Flip charts
• Post-it notes
• Index cards
• Wizard-of-Oz
• Design tool

High-fidelity:
• Often computer-based, and usually allows realistic user interactions.
• Takes you as close as possible to a true representation of the user interface.
• Much more effective in collecting true human performance data and in demonstrating actual products to clients, management, and others.
• Visual design: Realistic and detailed design.
• Content: Designers use real or similar-to-real content.
• Interactivity: Prototypes are highly realistic in their interactions.
• Represents the core functionality of the product’s user interface.
• Users can enter data in entry fields, respond to messages, select icons to open windows, interact with the user interface, etc.
38
Pros and cons of low-fidelity and high-fidelity prototypes

Low-fidelity advantages:
– Lower development cost
– Evaluates multiple design concepts
– Useful communication device
– Addresses screen layout issues
– Proof of concept

Low-fidelity disadvantages:
– Limited error checking
– Poor detailed specification to code to
– Facilitator driven
– Limited utility after requirements established
– Limited usefulness for usability tests
– Navigational and flow limitations

High-fidelity advantages:
– Complete functionality
– Fully interactive
– User driven
– Use for exploration and test
– Look and feel of final product
– Serves as a living specification

High-fidelity disadvantages:
– More resource-intensive to develop
– Time-consuming to create
– Inefficient for proof-of-concept design
– Not effective for requirement gathering

39
UNIVERSITY OF BERGEN

Digital prototyping tools


• http://prototypingtools.co/
• Balsamiq
• Axure
• Figma
• https://www.mockflow.com/
• https://careerfoundry.com/en/blog/ux-design/free-
wireframing-tools/
UNIVERSITY OF BERGEN

Physical prototyping
• Get feedback on our design faster; saves money
• Experiment with alternative designs
• Fix problems before code is written
• Keep the design centered on the user
8: Evaluation: From
Framework to
Measurables
Estimating the time of pointing
UNIVERSITY OF BERGEN

Hick’s Law
Other Similar Laws: Steering Law
Touch Sensors and Understanding Touch
Estimating the minimal size of a graphical item
9: Machine Learning
and recommender
systems

By Mehdi Elahi
(Movie info, cast & crew)
(Colors, sounds)
10: VR and AR

By Ilona Heldal
UNIVERSITY OF BERGEN

Natural interaction
• A Natural User Interface (NUI) is often defined as an “effectively
invisible” UI.
• It can be invisible because no instructions should ideally be needed.
• A NUI is an interface that “enables us to interact with a computer in
the same ways we interact with the physical world, through using our
voice, hands and bodies” (Rogers, Sharp, Preece).
– This could be done in VR… Does it mean we should?

• “Which interface is most appropriate, most useful, most efficient, most


supportive, etc., will depend on the interplay of a number of factors,
including reliability, social acceptability, privacy, ethical and location
concerns” (Rogers, Sharp, Preece).
VR as a bridge between cyberspace and
the physical world.
• Can VR, or AR, fulfill the vision of tangible bits?
• Objects in VR definitely make bits “virtually”
physical.
• We are, however, fully in cyberspace.
• The vision can inspire us to make the virtual physical,
by valuing natural semantics and haptic interaction.
11: Life logging

By Duc Tien Dang Nguyen
UNIVERSITY OF BERGEN

• Dodge and Kitchin (2007), refer to lifelogging as “a form of pervasive


computing, consisting of a unified digital record of the totality of an
individual’s experiences, captured multi-modally through digital sensors
and stored permanently as a personal multimedia archive”.
UNIVERSITY OF BERGEN
uib.no
