Plagiarism Checker X Originality Report: Similarity Found: 24%
User Interface Technology in Homes
Prepared for Prof. Dr. Sherif Hafez, Faculty of Engineering, Alexandria University
Prepared by Ibrahim Hani Ibrahim, ID: 5158
January 2, 2021

Table of Contents
List of Figures
Acronyms
Abstract
Introduction
History of UI
UI and UX
What is user experience (UX) design?
How does UI differ from UX?
Different types of UI
Command Line Interface (CLI)
Menu Driven Interface (MDI)
Examples of MDI
Graphical User Interface (GUI)
Touchscreen GUI
Example of GUI
Technologies using UI
Interactive visualization
Gesture Recognition and Motion Sensing
Eye tracking and facial recognition
Virtual laser keyboards
Human Robot Interface
Examples of UI technology in Homes
Best Smart Home Devices which use UI
Conclusion
References
Arabic summary
Plagiarism Checker

List of Figures
Figure 1 shows the early Xerox PARC innovations that later led to the user interface
Figure 2 shows the difference between UI and UX
Figure 3 shows the command line interface
Figure 4 shows a kiosk
Figure 5 shows an ATM
Figure 6 shows an example of a GUI
Figure 7 shows a touchscreen GUI used in hospitals
Figure 8 shows how hospitals use UI to achieve interactive visualization
Figure 9 shows gesture recognition used at home
Figure 10 shows eye tracking used at home
Figure 11 shows a laser keyboard used at home
Figure 12 shows a robot helping a person at home shave
Figure 13 shows Google Nest Hub Max
Figure 14 shows Google smart control
Figure 15 shows an example of a monitoring screen
Figure 16 shows an example of a smart lamp
Figure 17 shows the Aros Smart Air Conditioner
Figure 18 shows the Blossom Smart Watering Controller
Figure 19 shows examples of smart doors
Figure 20 shows Voice Pods
Figure 21 shows examples of smart remotes
Acronyms
UI - User Interface
UX - User Experience
CLI - Command Line Interface
MDI - Menu Driven Interface
GUI - Graphical User Interface
US - United States
HTML - Hypertext Markup Language
PC - Personal Computer
IT - Information Technology
MEMS - Micro-electromechanical systems
CPU - Central Processing Unit
3D - Three Dimensions

Abstract:
This research paper aims to explicitly define the
user interface and explain its different types.
It also demonstrates its broad uses in different fields. Moreover, this paper shows different uses of UI in homes.

Introduction:
The user interface (UI) is the point of
human-computer interaction and communication in a device. This can include display screens, keyboards, a mouse, and the desktop itself. It is also the channel through which a user interacts with an application or a website. The growing reliance of many organizations on web applications and mobile applications has led many of them to place increased priority on UI in order to improve the user's overall experience.

Customers around the globe are now able to establish and maintain connections, regardless of location, through UI. Combined with a wide variety of applications and cloud service offerings, this has produced new avenues through which consumers access content and has also triggered shifts in consumer behavior. For example, with consumers now using various smart electronic devices for entertainment (e.g., video streaming), the proportion of consumers watching broadcast and cable television has dropped.
History of UI:
UIs have been around for as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, several papers on the history of Human-Computer Interaction and User Interfaces have appeared, mostly focusing on the graphical interface era and early visionaries such as Bush, Engelbart, and Kay. In recent decades, historians of technology have written extensively on the history of computers and computing. These works complement the writings of computing pioneers such as Herman Goldstine and Maurice Wilkes.
Over the years, several papers on the history of Human-Computer Interaction have appeared, as well as a book on the history of the neighboring field of Human Factors and Ergonomics. Little work has been published on the history of User Interfaces at large, with one notable exception: the history of the Graphical User Interface, including the efforts at Xerox PARC in the 1970s and 1980s and several precursors.
Authors include not only community members, historians of technology, and technology writers, but also researchers in Media and Cultural Studies, as the Graphical User Interface has become deeply embedded in our culture. These approaches mainly center on the graphical interface period, with its predecessors and early visionaries such as Vannevar Bush, Doug Engelbart, Ivan Sutherland, Ted Nelson, and Alan Kay. Despite the tremendous influence of these pioneers, the time has come to illustrate the broader history of the User Interface.
There are other reasons for embarking on the study of the History of User Interfaces. The general level of knowledge about the history of UIs appears to be limited. Indeed, an informal survey among students at the IT University of Copenhagen regarding the history of computers and interfaces suggests that, for them, this history collectively begins with the PC and the graphical UI. The meaning of the term user interface has also changed considerably over time. Some design issues in today's "baby" interfaces in information appliances resemble those of the past, with small, character-based screens, so it seems obvious to draw on earlier experience. Yet the evolution of interfaces appears to be regarded as a straightforward succession of improvements.
UI and UX
In any case, despite their professional relationship, the roles themselves are quite distinct, referring to entirely different parts of the product development process and the design discipline.

What is user experience (UX) design?
User experience design is a human-first way of designing products. Don Norman, a cognitive scientist and co-founder of the Nielsen Norman Group design consultancy, is credited with coining the term "user experience" in the late 1990s.
Here is how he describes it: "User experience encompasses all aspects of the end-user's interaction with the company, its services, and its products." Regardless of its medium, UX design encompasses all interactions between a potential or active customer and a company. As a scientific process, it can be applied to anything: streetlights, cars, Ikea shelving, and so on. However, despite being a scientific term, its use since its inception has been mostly within digital fields; one reason for this is that the tech industry started booming around the time the term was created.
In relation to websites and applications, UI design considers the look, feel, and interactivity of the product. It is about making sure that the user interface of a product is as intuitive as possible, and that means carefully considering every visual, interactive element the user may encounter. A UI designer will consider icons and buttons, typography, color schemes, spacing, imagery, and responsive design. UX design is all about the overall feel of the experience, while UI design is all about how the product's interfaces look and function.
A UX designer considers the user’s entire journey to solve a particular problem; what
steps do they take? What tasks do they need to complete? How straightforward is the
experience? Much of their work focuses on finding out what kinds of problems and
pain-points users come up against, and how a certain product might solve them. They’ll
conduct extensive user research in order to find out who the target users are and what
their needs are in relation to a certain product. They’ll then map out the user’s journey
across a product, considering things like information architecture (i.e., how the content is organized and labelled across a product) and what kinds of features the user might
need.
Eventually, they will create wireframes that lay out the bare-bones designs for the product. With the skeleton of the product outlined, the UI designer steps in to bring it to life. The UI designer considers all the visual aspects of the user's journey, including all the individual screens and touchpoints that the user may encounter; think tapping a button, scrolling down a page, or swiping through an image gallery. While the UX designer maps out the journey, the UI designer focuses on all the details that make this journey possible.
This isn't to say that UI design is only about looks; UI designers have a huge impact on whether a product is accessible and inclusive. They will ask questions like "How can different color combinations be used to create contrast and enhance readability?" or "Which color pairings accommodate color blindness?"

Different types of UI:
Command Line Interface (CLI) -
A command line interface (CLI) is a text-based user interface (UI) used to view and manage computer files. Command line interfaces are also called command-line user interfaces, console user interfaces, and character user interfaces.

Before the mouse, users interacted with an operating system (OS) or application with a keyboard. Users typed commands in the command line interface to run tasks on a computer. Typically, the command line interface features a black box with white text. The user responds to a prompt in the command line interface by typing a command. The output or response from the system can include a message, table, list, or some other confirmation of a system or application action.
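The prompt-command-response cycle can be sketched in a few lines. The snippet below is a minimal, hypothetical command loop in Python; the commands "date" and "quit" are invented for illustration and do not belong to any particular operating system shell:

```python
# A minimal sketch of a command line interface: the program prints a prompt,
# reads a typed command, and responds with text output. The commands here
# ("date", "quit") are hypothetical examples, not real OS commands.
import datetime

def run_cli():
    while True:
        command = input("> ").strip().lower()   # the prompt the user responds to
        if command == "date":
            print(datetime.date.today())        # response: a simple message
        elif command == "quit":
            break                               # leave the interface
        else:
            print(f"Unknown command: {command}")

if __name__ == "__main__":
    run_cli()
```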
Menu Driven Interface (MDI) -
A menu-driven interface is, simply, an easier way of navigating the devices and programs we interact with daily. It employs a series of screens, or "menus," that allow users to make choices about what to do next.

A menu-driven interface can use a list format or graphics, with one selection leading to the next menu screen, until the user has completed the desired outcome.
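As a rough illustration of this pattern, the hypothetical Python sketch below (not drawn from any specific device) shows how one numbered choice leads to the next menu screen until the user reaches the desired outcome:

```python
# An illustrative menu-driven interface: each screen lists numbered choices and the
# selection decides which menu (or final action) comes next, much like an ATM or kiosk.
def main_menu():
    print("1) Account  2) Exit")
    choice = input("Select: ").strip()
    if choice == "1":
        account_menu()
    else:
        print("Goodbye")

def account_menu():
    print("1) Show balance  2) Back")
    choice = input("Select: ").strip()
    if choice == "1":
        print("Balance: 250.00")     # made-up value for the example
    main_menu()                      # return to the previous menu screen

if __name__ == "__main__":
    main_menu()
```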
Examples of MDI:
Common examples include kiosks (Figure 4) and ATMs (Figure 5).

Graphical User Interface (GUI) -
A GUI is a system of interactive visual components for computer software. A GUI displays objects that convey information and represent actions that can be taken by the user. The objects change color, size, or visibility when the user interacts with them. GUI objects include icons, cursors, and buttons. These graphical elements are sometimes enhanced with sounds or visual effects like transparency and drop shadows.
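As a small illustration of these ideas, the following sketch (assuming Python's standard tkinter toolkit) shows a single button object that changes color and label when the user interacts with it:

```python
# A minimal GUI sketch using Python's built-in tkinter toolkit: a window with one
# button object that changes color and label text when the user clicks it,
# illustrating how GUI objects respond visually to interaction.
import tkinter as tk

def on_click():
    button.config(bg="lightgreen", text="Clicked!")   # object changes when interacted with

root = tk.Tk()
root.title("GUI example")
button = tk.Button(root, text="Click me", bg="lightgray", command=on_click)
button.pack(padx=40, pady=40)
root.mainloop()
```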
Technologies using UI:

Interactive visualization
business analytics solution. This allows users to access business data modelling and interact freely with the data without being reliant on IT departments and vendors. This
represents a shift toward more user-centric data analysis capabilities from traditional
IT-centric and report-centric approaches. There are also more solutions that incorporate
interactive visualization, namely Adobe Flash, Microsoft Silverlight, Ajax, HTML5 and
other Web 2.0 technologies that enable animated, interactive displays of data.
These examples illustrate the insights that users can obtain by interacting with the visual
representation of data.
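To make the idea concrete, here is a small sketch of an interactive display of data, assuming Python with matplotlib; the monthly sales figures are made up for illustration. Clicking near a point annotates its value, letting the viewer probe the chart directly instead of reading a static report:

```python
# Minimal sketch of an interactive visualization: clicking near a data point
# annotates it, so the user explores the values directly on the chart.
import matplotlib.pyplot as plt
import numpy as np

x = np.arange(12)                                          # e.g., months
y = np.random.default_rng(0).integers(10, 100, size=12)    # made-up monthly sales

fig, ax = plt.subplots()
ax.plot(x, y, "o-")
ax.set_xlabel("Month")
ax.set_ylabel("Sales")

def on_click(event):
    """Annotate the data point closest to the mouse click."""
    if event.inaxes is not ax:
        return
    i = int(np.argmin(np.abs(x - event.xdata)))
    ax.annotate(f"{y[i]}", (x[i], y[i]), textcoords="offset points", xytext=(5, 5))
    fig.canvas.draw_idle()

fig.canvas.mpl_connect("button_press_event", on_click)
plt.show()
```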
Gesture Recognition and Motion Sensing
Gesture recognition involves determining the movement of a user's fingers, hands, arms, head, or body in three dimensions through the use of a camera or a device with embedded sensors that may be worn, held, or body-mounted. With Microsoft making the Software Development Kit for the commercial version of Kinect free, developers will be encouraged to innovate and create apps in multiple areas, including business, healthcare, and gaming.
In the short term, elementary gesture recognition technology, such as the manipulation and retrieval of data on smart and mobile devices, will see mainstream adoption. In addition, motion detection capabilities are already present on smart devices today.
Micro-electromechanical systems (MEMS) enable a multitude of applications from
motion detection to navigation. Mobile phones today already have MEMS sensors such
as accelerometers (rotation of phone display according to user movement), gyroscopes
(maintain or measure orientation), barometers and altimeters (measure atmospheric
pressure), and magnetometers (provide compass direction functionality).
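The core idea of motion detection from such sensors can be sketched in a few lines. The following is a simplified, illustrative Python example: the threshold value is assumed, and the samples are simulated rather than read from a real MEMS accelerometer, since reading real sensor data depends on the device's own API:

```python
# A conceptual sketch of motion sensing from accelerometer samples (simulated here).
# A simple threshold on the acceleration magnitude flags a "shake" gesture; at rest,
# the magnitude stays near gravity (about 9.81 m/s^2).
import math

SHAKE_THRESHOLD = 18.0  # assumed value in m/s^2; real devices need tuning

def is_shake(sample):
    """Return True if one (x, y, z) accelerometer sample exceeds the shake threshold."""
    x, y, z = sample
    magnitude = math.sqrt(x * x + y * y + z * z)
    return magnitude > SHAKE_THRESHOLD

# Simulated samples: resting (roughly gravity only), then a vigorous shake.
samples = [(0.1, 0.2, 9.8), (0.0, 0.1, 9.7), (12.0, 15.0, 9.8), (0.2, 0.1, 9.8)]
for i, s in enumerate(samples):
    if is_shake(s):
        print(f"Shake detected at sample {i}")
```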
Eye tracking and facial recognition
Eye tracking technologies determine the angle or position of a user's visual attention, typically using cameras.
The different types of non-intrusive eye trackers generally include two common
components: a light source and a camera. The light source (usually infrared) is directed
toward the eye while the camera tracks the reflection of the light source. The data
obtained is used to extrapolate the rotation of the eye and derive the direction of gaze.
Eye tracking can provide the following forms of data for analysis. Scan paths allow
researchers to analyze the flow of attention throughout a design. The scan paths show
eye fixations and the order in which they occur. This provides insights to feature visibility
and the visual impact on customers. Sequential videos can also show scan path
progression in real time.
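A rough sketch of how scan-path fixations might be derived from raw gaze samples is shown below. It uses a simple dispersion threshold with illustrative, uncalibrated parameter values and is not tied to any particular eye-tracker SDK:

```python
# A simplified sketch of extracting a scan path (ordered fixations) from gaze samples
# using a dispersion threshold: consecutive samples that stay within a small window
# are grouped into one fixation. Thresholds here are illustrative, not calibrated.
def fixations(samples, max_dispersion=30, min_length=5):
    """samples: list of (x, y) gaze points; returns fixation centers in order."""
    result, window = [], []
    for point in samples:
        window.append(point)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            if len(window) - 1 >= min_length:          # window minus the breaking point
                result.append((sum(xs[:-1]) / (len(window) - 1),
                               sum(ys[:-1]) / (len(window) - 1)))
            window = [point]                            # start a new candidate fixation
    if len(window) >= min_length:
        result.append((sum(p[0] for p in window) / len(window),
                       sum(p[1] for p in window) / len(window)))
    return result

# Example: two clusters of gaze points produce two fixations, in order of occurrence.
gaze = [(100, 100), (102, 101), (101, 99), (103, 100), (102, 102), (101, 101),
        (400, 300), (402, 299), (401, 301), (399, 300), (400, 302), (401, 300)]
print(fixations(gaze))
```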
Heat maps display where users concentrate attention within a given design, revealing
which areas people focus on, as well as areas that are commonly ignored.
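A heat map of this kind can be approximated by binning gaze samples into a grid. The sketch below assumes Python with numpy and matplotlib, and uses randomly generated gaze points in place of real eye-tracker output:

```python
# A minimal sketch of turning raw gaze samples into a heat map, assuming gaze
# coordinates in screen pixels are already available from an eye tracker.
import numpy as np
import matplotlib.pyplot as plt

SCREEN_W, SCREEN_H = 1920, 1080

# Hypothetical gaze samples (x, y); real data would come from the tracker's SDK.
rng = np.random.default_rng(1)
gaze = rng.normal(loc=(960, 400), scale=(200, 120), size=(5000, 2))

# Bin the samples into a coarse grid; dense cells are the areas users focus on.
heat, _, _ = np.histogram2d(gaze[:, 0], gaze[:, 1],
                            bins=(48, 27),
                            range=[[0, SCREEN_W], [0, SCREEN_H]])

plt.imshow(heat.T, origin="upper", extent=[0, SCREEN_W, SCREEN_H, 0], cmap="hot")
plt.colorbar(label="Fixation density")
plt.title("Gaze heat map")
plt.show()
```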
Virtual laser keyboards
Virtual laser projectors are small projector modules that display a user interface (e.g., keyboard and mouse) onto any flat surface so that users can interact more easily with mobile devices or systems without the need for a physical keyboard or mouse.
The laser projectors synchronise with mobile devices via Bluetooth and create a virtual
keyboard almost the size of any standard physical keyboard on any surface. This gives
users the freedom of space to type and the convenience of not hauling around a
keyboard. Keystrokes on the virtual keypad projection send signals to the corresponding
mobile device via Bluetooth. The keyboard can even have the functionality of a mouse
which moves the cursor on the device as the fingertip moves on the laser surface.
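Conceptually, translating a fingertip position on the projected surface into a keystroke amounts to a lookup into a grid of key cells. The sketch below is purely illustrative: the key dimensions and layout are assumed, and real products add camera-based touch detection and Bluetooth transmission on top of this mapping:

```python
# An illustrative sketch of how a projected (virtual) keyboard might map a detected
# fingertip position to a keystroke: the flat projection is treated as a grid of key
# cells, and the (x, y) point reported by the sensor is looked up in that grid.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_W, KEY_H = 40, 40          # assumed key size in millimetres on the surface

def key_at(x_mm, y_mm):
    """Return the character under a fingertip position, or None if off the keys."""
    row = int(y_mm // KEY_H)
    col = int(x_mm // KEY_W)
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None

# A detected tap at (165 mm, 50 mm) falls on row 1, column 4, i.e. "g".
print(key_at(165, 50))
```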
Human Robot Interface:
This approach will allow robots to offload computationally intensive tasks like image
processing and voice recognition to the cloud and even download new skills instantly.
By offloading heavy computing tasks to the cloud, hardware will be easier to maintain
and Central Processing Unit (CPU) hardware upgrades will be hassle-free. Robots will
have longer battery life and less need for software pushes and updates.

Smart
home solutions are designed to deliver comfort and convenience. Apart from controlling
devices on smart panels, you can control everything on your smartphone or with your
facial expressions or even with your motions.
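As a toy illustration of this kind of smartphone-driven control, the sketch below defines purely hypothetical SmartLock and SmartLamp classes; real devices expose similar actions through their vendors' own apps and hubs rather than through this invented interface:

```python
# A purely illustrative sketch of a smartphone-style smart home controller.
# SmartLock / SmartLamp and their methods are hypothetical stand-ins, not a real
# vendor API; real products expose similar actions through their own apps or hubs.
from dataclasses import dataclass

@dataclass
class SmartLock:
    name: str
    locked: bool = False

    def lock(self):
        self.locked = True
        print(f"{self.name}: locked remotely")

@dataclass
class SmartLamp:
    name: str
    on: bool = False

    def turn_off(self):
        self.on = False
        print(f"{self.name}: turned off")

# "Forgot to lock the door" scenario, handled from an app-like script.
front_door = SmartLock("Front door")
hall_lamp = SmartLamp("Hall lamp", on=True)
front_door.lock()
hall_lamp.turn_off()
```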
Examples of UI technology in Homes:
Smart homes are becoming more and more
popular. Just by looking back at the past decades, we can see how the interaction
between people and smart devices has changed drastically from early computers with
text-based commands to graphical user interfaces, all the way to mobile and touch
devices. And the evolution of technology does not end here as today we’re experiencing
the integration of services into our homes, controlled with our voice. So let's look at some examples. There is no need to panic if you forgot to lock the door; you can now do it through an app.
You can also check on your pet through a smart cam to make sure everything is okay.
You can basically have control over almost everything you own.

- Play or pause videos without touching the device (Figure 13 shows Google Nest Hub Max)
- Turn the lights on or off (Figure 14 shows Google smart control)
- Monitor every single meter in your home (Figure 15 shows an example of a monitoring screen)
- Smart lighting solutions (Figure 16 shows an example of a smart lamp)
- Smart home utilities based on UI (Figure 17 shows the Aros Smart Air Conditioner; Figure 18 shows the Blossom Smart Watering Controller)
- Door locks (Figure 19 shows examples of smart doors)
- Smart voice recognition and voice-activated products (Figure 20 shows Voice Pods)
- Smart remote controls (Figure 21 shows examples of smart remotes)

Best Smart Home Devices which use UI:
- Amazon Echo
- Philips Hue
- TP-Link HS200
- Ecobee4
- NetGear Arlo Q
- Char-Broil Digital Electric Smoker with Smart Chef Technology
- Perfect Bake Pro
- Ecovacs Deebot N79S
- LG Smart TV

Conclusion:
User interface technology is a widespread and helpful resource that
makes our lives easier every day.
There are different types of user interface including CLI, MDI, GUI, and touchscreen GUI.
These are all implemented in our day-to-day activities and homes. The paper also
explicitly discussed interactive visualization, gesture recognition, motion sensing and
eye tracking. Finally, the research paper touched upon virtual laser keyboards and
human-robot interfaces as upcoming technologies in the UI field.

References:
https://www.researchgate.net/publication/227998407_User_Interface_Design
https://www.researchgate.net/publication/317660257_User_Interface_and_User_Experience_UIUX_Design
https://www.researchgate.net/publication/327673648_A_study_on_understanding_of_UI_and_UX_and_understanding_of_design_according_to_user_interface_change
https://www.researchgate.net/publication/261247756_A_study_of_User_Interface_Design_principles_and_requirements_for_developing_a_Mobile_learning_prototype
https://www.researchgate.net/publication/327898655_Analysis_of_User_Interface_and_User_Experience_on_Comrades_Application
INTERNET SOURCES:
-------------------------------------------------------------------------------------------
<1% - https://www.slideshare.net/nurshamimahsamsuddin/group-assignment-health-industry-tourism
<1% - https://en.wikipedia.org/wiki/User_interface
<1% - https://www.interaction-design.org/literature/topics/human-computer-interaction
<1% - https://www.researchgate.net/publication/221515200_User_interface_history
<1% - http://nostalgicfutures.gvu.gatech.edu/
<1% - https://www.slideshare.net/figentas/case-incident-2
4% - https://careerfoundry.com/en/blog/ux-design/the-difference-between-ux-and-ui-design-a-laymans-guide/
<1% - https://www.beyondthebookcast.com/wp-images/NielsonNormanTranscript.pdf
<1% - https://www.newsblogtips.com/2020/02/UX-design.html
1% - https://honchous.com/ui-ux-design/
<1% - https://www.orbitmedia.com/blog/7-reasons-to-wireframe/
3% - https://searchwindowsserver.techtarget.com/definition/command-line-interface-CLI
1% - https://www.coursehero.com/file/62724697/Week-8-Discussiondocx/
1% - https://study.com/academy/lesson/menu-driven-interface-definition-examples.html
2% - https://www.computerhope.com/jargon/g/gui.htm
1% - https://www.altia.com/category/medical-2/
1% - https://www.altia.com/2014/08/29/gui-reshaping-modern-medicine/
1% - https://www.gartner.com/en/information-technology/glossary/interactive-visualization
4% - https://www.slideshare.net/carloshuertasperez5/user-interface-42201296
<1% - https://www.techopedia.com/2/31007/trends/virtualization/10-ways-virtualization-can-improve-security
<1% - https://www.engineersgarage.com/article_page/gesture-recognition-technology/
<1% - https://www.crn.com.au/news/microsoft-to-offer-kinect-sdk-for-businesses-278560
<1% - http://e-university.tu-sofia.bg/e-publ/files/1165_2_F04.pdf
1% - https://www.click.co.uk/blog/link-building-infographics-and-eye-tracking-a-multipurpose-solution/
<1% - https://coolstuffjapan.com/bluetooth-projection-laser-keyboard
1% - https://spectrum.ieee.org/automaton/robotics/robotics-hardware/robotics-trends-for-2012
<1% - https://www.slideshare.net/chhattanshah/keynote-on-mobile-grid-and-cloud-computing
<1% - https://www.the-ambient.com/how-to/get-started-with-the-smart-home-199
<1% - https://www.digiteum.com/smart-home-trends/
<1% - https://amiteche.com/the-evolution-of-technology-we-are-the-future/
<1% - https://mashable.com/roundup/best-pet-cameras/
<1% - https://www.theverge.com/2019/5/7/18529318/google-nest-hub-max-smart-display-camera-price-features-specs-release-hands-on-io-2019