Human Computer Interaction Reviewer
Interactive Products
● Smartphone
● Tablet
● Computer
● Remote control
● Coffee machine
● ATM
● Ticket machine
● Printer
● iPod
● GPS
● E-reader
● TV
● Radio
● Games console
Good and poor design - A good design is effective, efficient, and enjoyable to use, while a poor design is ineffective, inefficient, and frustrating to use.
Interaction Design - designing interactive products to support the way people communicate and interact in
their everyday and working lives.
The Components of Interaction Design
● Academic Disciplines
● Human Computer Interaction (HCI)
● Information Systems
● Film Industry
● Industrial Design
● Artist Design
● Product Design
● Graphic Design
● Engineering
● Human Factors (HF)
● Computer Science
User Experience - The user experience (UX) is central to interaction design. By this it is meant how a product
behaves and is used by people in the real world.
The iPod phenomenon - Apple's classic generations of iPods have been a phenomenal success.
McCarthy and Wright propose four core threads that make up our holistic experiences:
1. The sensual thread - This is concerned with our sensory engagement with a situation and is like the visceral level of Norman's model.
2. The emotional thread - Common examples of emotions that spring to mind are sorrow, anger, joy, and
happiness.
3. The compositional thread - This is concerned with the narrative part of an experience, as it unfolds, and the
way a person makes sense of it.
4. The spatio-temporal thread - This refers to the space and time in which our experiences take place and
their effect upon those experiences.
Design Principles
Design principles are used by interaction designers to aid their thinking when designing for the user
experience.
1. Visibility.
The importance of visibility is exemplified by our contrasting examples at the beginning of the chapter. The
voice mail system made the presence and number of waiting messages invisible, while the answer machine
made both aspects highly visible.
2. Feedback.
Related to the concept of visibility is feedback. This is best illustrated by an analogy to what everyday life
would be like without it.
3. Constraints.
The design concept of constraining refers to determining ways of restricting the kinds of user interaction that
can take place at a given moment.
4. Consistency.
This refers to designing interfaces to have similar operations and use similar elements for achieving similar
tasks. A consistent interface is one that follows rules, such as using the same operation to select all objects.
5. Affordance.
This is a term used to refer to an attribute of an object that allows people to know how to use it (for example, a door handle affords pulling). A short code sketch showing how several of these principles can appear in an interface follows below.
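The following is a minimal TypeScript/DOM sketch (not from the reviewer) of how the constraints, feedback, and visibility principles might show up in a simple web form; it assumes a page containing hypothetical elements with the IDs "name", "submit", and "status":

const nameInput = document.querySelector<HTMLInputElement>("#name")!;
const submitButton = document.querySelector<HTMLButtonElement>("#submit")!;
const statusLine = document.querySelector<HTMLParagraphElement>("#status")!;

nameInput.addEventListener("input", () => {
  const valid = nameInput.value.trim().length > 0;
  submitButton.disabled = !valid;             // constraint: restrict what the user can do right now
  statusLine.textContent = valid
    ? "Ready to submit."                      // feedback: the system reports its current state
    : "Please enter your name.";              // visibility: the requirement stays visible on screen
});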
Experiential - a state of mind in which we perceive, act, and react to events around us intuitively and
effortlessly. It requires reaching a certain level of expertise and engagement.
Reflective - involves mental effort, attention, judgment, and decision making. This kind of cognition is what leads to new ideas and creativity.
Attention - This is the process of selecting things to concentrate on, at a point in time, from the range of
possibilities available. Attention involves our auditory and/or visual senses.
Information Presentation - The way information is displayed can also greatly influence how easy or difficult it
is to attend to appropriate pieces of information.
Multitasking Attention - Many of us now spend a large proportion of our time staring at a screen, be it a
smartphone, laptop, TV, or tablet.
Perception - refers to how information is acquired from the environment via the different sense organs – eyes, ears, fingers – and transformed into experiences of objects, events, sounds, and tastes (Roth, 1986).
Memory - involves recalling various kinds of knowledge that allow us to act appropriately.
Learning - GUIs and direct manipulation interfaces are good environments for supporting active learning.
Reading, speaking, and listening - are three forms of language processing that have similar and different
properties.
Problem solving, planning, reasoning, and decision making - are processes involving reflective cognition. They
include thinking about what to do, what the options are, and what the consequences might be of carrying out
a given action.
Cognitive Framework - three early internal frameworks that focus primarily on mental processes together
with three more recent external ones that explain how humans interact and use technologies in the context in
which they occur.
1. INTERNAL - Mental models, gulfs of execution and evaluation, and information processing
2. EXTERNAL - Distributed cognition, external cognition, and embodied interaction
1. Being Social - A fundamental aspect of everyday life is being social – interacting with each other. We continuously update each other about news, changes, and developments on a given project, activity, person, or event.
Two types of conversation
1. Face-to-face conversations - when two or more people interact and communicate while visible to one another.
2. Remote conversations - a way of communicating with others online.
IMPLICIT OR EXPLICIT CUES
1. IMPLICIT - Signaling indirectly to the other participants that a speaker wants the conversation to draw to a close.
2. EXPLICIT - Direct cues; these types of cues are specific and clear.
1. Telepresence - allows real-time, two-way collaboration between people who are not in the same location
2. Co-presence - A communication dimension that refers to participants in a communication being located in
the same physical setting.
3. Physical Coordination - When people are working closely together, they talk to each other, issuing
commands and letting others know how they are progressing.
4. Awareness - Involves knowing who is around, what is happening, and who is talking with whom.
4.1 Peripheral awareness - Keeping an eye on things happening in the periphery of vision
4.2 Overhearing and overseeing – allows tracking of what others are doing without explicit cues.
5. Shareable Interfaces - Several studies have been carried out investigating whether different arrangements
of shared technologies can help co-located people work together better.
-----------------------Sir's Report-----------------------
Gestalt Principles define some basic laws that help us understand how the human mind perceives visual
stimuli.
The fundamental principle of perceptual grouping is the law of Prägnanz (also known as the law of good Gestalt).
The law of Prägnanz says that we tend to experience things as:
1. regular
2. orderly
3. symmetrical
4. simple
"Of several geometrically possible organizations, that one will occur which possesses the best, simplest, and most stable shape." — Kurt Koffka
The principles were based on similarity, proximity, and continuity.
1. Proximity - affirms that we perceive elements that are closer to each other as belonging to the same group
2. Similarity - explores the fact that similar elements are perceived as being part of the same group and
having the same function.
3. Continuity - elements positioned in a line (or curve) are perceived as a continuation, a sequence of facts
arranged in an order, or a follow-up of the previous element.
4. Closure - The human brain automatically fills in gaps to perceive a complete shape. This Gestalt principle states that we use memory to convert complex objects into simpler or known shapes.
5. Figure-ground - we instinctively perceive objects as being either in the foreground or the background.
6. Common region - The common region principle is related to proximity. This principle states that when
objects are positioned within the same closed region, they are perceived as part of the same group
7. Focal points - The focal point law states that any element that stands out visually captures and holds the
viewer’s attention.
2. Expressive Interfaces
Expressive forms like emoticons, sounds, icons, and virtual agents have been used at the interface
to convey emotional states and/or elicit certain kinds of emotional responses in users, such as feeling
at ease, comfort, and happiness.
Other ways of conveying the status of a system (a brief code sketch follows this list) are through the use of:
Dynamic icons (e.g., a recycle bin expanding when a file is placed in it and paper disappearing in a puff when
emptied).
Spoken messages, using various kinds of voices, telling the user what needs to be done (e.g., GPS navigation
system instructing you politely where to go after having taken a wrong turn).
Various sonifications indicating actions and events (e.g., whoosh for a window closing, schlook for a file being dragged, ding for new email arriving).
Vibrotactile feedback, such as distinct smartphone buzzes that specifically represent special messages from
friends and family.
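As a hedged illustration of the cues listed above, the short browser sketch below uses standard Web APIs (the Web Speech API, HTMLAudioElement, and the Vibration API); the sound file name is a hypothetical asset and the buzz pattern is invented:

// Plays a sonification, speaks a message, and buzzes the device (where supported)
// when new email arrives.
function announceNewEmail(): void {
  void new Audio("ding.mp3").play();           // sonification ("ding.mp3" is a hypothetical asset)

  const message = new SpeechSynthesisUtterance("You have new email.");
  window.speechSynthesis.speak(message);       // spoken message via the Web Speech API

  if ("vibrate" in navigator) {
    navigator.vibrate([100, 50, 100]);         // vibrotactile feedback: a distinct buzz pattern
  }
}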
3. Annoying Interfaces
Computer interfaces may inadvertently elicit negative emotional responses such as anger and disgust. Interfaces, if designed poorly, can make people look stupid or feel insulted or threatened. The effect can be to make them annoyed to the point of losing their temper.
Chapter 6 Interfaces
Interface Types
Numerous adjectives have been used to describe the different kinds of interfaces that have been developed,
including graphical, command, speech, multimodal, invisible, ambient, affective, mobile, intelligent, adaptive,
smart, tangible, touchless, and natural.
1. Command-Based
Early interfaces required the user to type in commands, typically abbreviations, at the prompt symbol appearing on the computer display, to which the system responded. Another way of issuing commands is through pressing certain combinations of keys (e.g., Shift+Alt+Ctrl). A small command-prompt sketch appears after this list of interface types.
2. WIMP and GUI
Window design. Windows were invented to overcome the physical constraints of a computer display, enabling more information to be viewed and tasks to be performed on the same screen.
Menu design. Just like restaurant menus, interface menus offer users a structured way of choosing from the
available set of options. Headings are used as part of the menu to make it easier for the user to scan through
them and find what they want.
Interface menu designs have employed similar methods of categorizing and illustrating options available that
have been adapted to the medium of the GUI. A difference is that interface menus are typically ordered across
the top row or down the side of a screen using category headers as part of a menu bar.
Icon design. Icons can be designed to represent objects and operations at the interface using concrete objects and/or abstract symbols.
3. Multimedia - as the name implies, combines different media within a single interface, namely, graphics, text, video,
sound, and animations, and links them with various forms of interactivity.
4. Virtual Reality - Virtual reality (VR) uses computer-generated graphical simulations to create “the illusion of
participation in a synthetic environment rather than external observation of such an environment” (Gigante, 1993,
p. 3). VR is a generic term that refers to the experience of interacting with an artificial environment, which makes it
feel virtually real.
5. Web - Early websites were largely text-based, providing hyperlinks to different places or pages of text. Much of the design effort was concerned with how best to structure information at the interface to enable users to navigate and access it easily and quickly.
6. Mobile - Mobile devices have become pervasive, with people increasingly using them in all aspects of their everyday
and working lives.
7. Speech - A speech or voice user interface is where a person talks with a system that has a spoken language
application, like a train timetable, a travel planner, or a phone service.
8. Pen - Pen-based devices enable people to write, draw, select, and move objects at an interface using lightpens or
styluses that capitalize on the well-honed drawing and writing skills that are developed from childhood.
9. Touch screens - such as walk-up kiosks (e.g., ticket machines, museum guides), ATMs, and till machines (e.g.,
restaurants), have been around for some time.
10. Air-Based Gestures - Camera capture, sensor, and computer vision techniques have advanced such that it is now
possible to accurately recognize people's body, arm, and hand gestures in a room.
11. Haptic interfaces - provide tactile feedback, by applying vibration and forces to the person, using actuators that are
embedded in their clothing or a device they are carrying, such as a smartphone or smartwatch.
12. Multimodal interfaces - are intended to provide enriched and complex user experiences by multiplying the way
information is experienced and controlled at the interface through using different modalities, i.e., touch, sight,
sound, speech.
13. Shareable interfaces - are designed for more than one person to use. Unlike PCs, laptops, and mobile devices – that
are aimed at single users – they typically provide multiple inputs and sometimes allow simultaneous input by
collocated groups.
14. Tangible interfaces - use sensor-based interaction, where physical objects, e.g., bricks, balls, and cubes, are coupled
with digital representations.
15. Augmented and Mixed Reality - Other ways that the physical and digital worlds have been bridged include
augmented reality, where virtual representations are superimposed on physical devices and objects, and mixed
reality, where views of the real world are combined with views of a virtual environment.
16. Wearables - Imagine being at a party and being able to access the Facebook of a person whom you have just met,
while or after talking to her, to find out more about her.
17. Robots and Drones - Robots have been with us for some time, most notably as characters in science fiction movies,
but also playing an important role as part of manufacturing assembly lines, as remote investigators of hazardous
locations (e.g., nuclear power stations and bomb disposal), and as search and rescue helpers in disasters (e.g., fires)
or far-away places (e.g., Mars).
18. Brain–Computer Interfaces - Brain–computer interfaces (BCI) provide a communication pathway between a person's brain waves and an external device, such as a cursor on a screen or a tangible puck that moves via airflow.
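Returning to interface type 1 above, here is a minimal, hedged sketch of a command-based interaction written for Node.js in TypeScript; the abbreviated commands and responses are invented for illustration and are not from the reviewer:

import * as readline from "node:readline";

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });

// Hypothetical abbreviated commands and the system's responses.
const commands: Record<string, () => string> = {
  ls: () => "listing files...",
  rm: () => "removing file...",
  q: () => "quitting.",
};

function prompt(): void {
  rl.question("> ", (line) => {
    const command = line.trim();
    const handler = commands[command];
    console.log(handler ? handler() : `Unknown command: ${command}`);
    if (command === "q") {
      rl.close();
      return;
    }
    prompt();  // keep showing the prompt until the user quits
  });
}

prompt();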
Natural User Interface (NUI) is a system for human-computer interaction that the user operates through intuitive
actions related to natural, everyday human behavior.
Kinds of Prototyping
a. Low Fidelity
b. High Fidelity
Low Fidelity
• Examples:
▪ ‘Post-it’ notes
▪ Storyboards
Storyboard
• It is a series of sketches showing how a user might progress through a task using the product
• Often used with scenarios, bringing in more detail and a chance to role play
Sketching
High Fidelity
• Prototype looks more like the final system than a low fidelity version
• High-fidelity prototypes can be developed by integrating existing hardware and software components
Conceptual model - an outline of what people can do with a product and what concepts are needed to understand and
interact with it
Generating Prototypes
1. Wheel
2. Timeline
Physical computing kits (for prototyping):
1. Arduino
2. LilyPad (for fabrics)
3. Senseboard
Software Development Kits (SDKs) – programming tools and components used to develop applications for a specific platform, for example iOS. SDKs make development much easier.
Includes:
IDE, documentation, drivers, sample code, and Application Programming Interfaces (APIs)
Example: the iOS SDK, used through the Xcode IDE.
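To make the idea of an SDK-provided API concrete, here is a generic, hedged TypeScript sketch; CameraSDK and its method are hypothetical stand-ins for the kind of interface a real platform SDK exposes, not part of iOS or any actual SDK:

interface Photo {
  width: number;
  height: number;
  data: Uint8Array;
}

// Hypothetical SDK class: in a real SDK this would wrap drivers and platform services.
class CameraSDK {
  async capturePhoto(): Promise<Photo> {
    return { width: 640, height: 480, data: new Uint8Array(640 * 480) }; // stubbed result
  }
}

// Application code calls the SDK's API instead of programming the hardware directly.
async function takeAndReportPhoto(): Promise<void> {
  const sdk = new CameraSDK();
  const photo = await sdk.capturePhoto();
  console.log(`Captured a ${photo.width}x${photo.height} photo.`);
}

void takeAndReportPhoto();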