Multimedia
BASICS OF MULTIMEDIA
Multimedia is media that uses multiple forms of information content and information processing.
In any business enterprise, multimedia exists in the form of advertisements, presentations, video
conferencing, voice mail, etc.
Multimedia is the field concerned with the computer-controlled integration of text, graphics,
drawings, still and moving images (Video), animation, audio, and any other media where every
type of information can be represented, stored, transmitted and processed digitally.
By definition, Multimedia is the representation of information in an attractive and interactive manner using a combination of text, audio, video, graphics and animation. In other words, Multimedia is a computerized method of presenting information that combines textual data, audio, visuals (video), graphics and animations. Examples include E-Mail, Yahoo Messenger, Video Conferencing, and the Multimedia Message Service (MMS).
Multimedia, as the name suggests, is the combination of "multi" and "media", that is, many types of media (hardware/software) used for the communication of information.
Multimedia Components (Elements)
Text: Text consists of alphanumeric characters and some other special characters. The keyboard is usually used for inputting text; however, there are also built-in features for including such text.
Graphics: This technology generates, represents, processes, manipulates, and displays pictures. It is one of the most important components of a multimedia application. The development of graphics is supported by different software packages. Graphics make a multimedia application attractive; in many cases people do not like reading large amounts of textual matter on the screen.
There are two types of Graphics:
Bitmap images- Bitmap images are real images that can be captured from devices such as digital cameras or scanners. Generally, bitmap images cannot be resized or edited without some loss of quality. Bitmap images require a large amount of memory.
Vector Graphics- Vector graphics are drawn on the computer and only require a small
amount of memory. These graphics are editable.
Audio: A multimedia application may require the use of speech, music and sound effects. These are called the audio or sound elements of multimedia. This technology records, synthesizes, and plays audio (sound). Many learning courses and different kinds of instruction can be delivered appropriately through this medium.
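As a concrete illustration of how sound is stored digitally, the short sketch below synthesizes one second of a 440 Hz tone and writes it to a WAV file using only Python's standard library; the file name, sample rate and frequency are illustrative choices, not part of these notes.

```python
# Minimal sketch: synthesize one second of digital audio (a 440 Hz tone)
# and save it as a WAV file using only the Python standard library.
import math
import struct
import wave

SAMPLE_RATE = 44100   # samples per second (CD quality)
DURATION = 1.0        # seconds of sound
FREQUENCY = 440.0     # Hz (the musical note A4)

frames = bytearray()
for n in range(int(SAMPLE_RATE * DURATION)):
    # amplitude of the sine wave at time n / SAMPLE_RATE, scaled to 16 bits
    sample = int(32767 * math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE))
    frames += struct.pack("<h", sample)

with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)            # mono
    wav.setsampwidth(2)            # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(bytes(frames))
```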
Video: The term video refers to a moving picture accompanied by sound, such as a picture on television. The video element of a multimedia application conveys a lot of information in a small duration of time. Digital video is useful in multimedia applications for showing real-life objects. Video places the highest performance demand on computer memory, and on bandwidth if it is placed on the internet. This technology records, synthesizes, and displays images (known as frames) in sequence at a fixed speed so that the result appears to move; this is how we see a fully developed video. In order to watch a video without any interruption, the video device must display 24 to 30 frames per second.
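To see why video places such a heavy demand on memory and bandwidth, the rough calculation below estimates the size of an uncompressed video stream; the resolution, colour depth and frame rate are assumed values, and real video is heavily compressed, so actual figures are far lower.

```python
# Rough, illustrative calculation of the storage/bandwidth an *uncompressed*
# video stream would need.  Resolution, colour depth and frame rate are assumptions.
width, height = 640, 480          # pixels per frame
bytes_per_pixel = 3               # 24-bit colour
frames_per_second = 30            # within the 24-30 fps range mentioned above

bytes_per_frame = width * height * bytes_per_pixel
bytes_per_second = bytes_per_frame * frames_per_second

print(f"One frame : {bytes_per_frame / 1024:.1f} KB")            # ~900 KB
print(f"One second: {bytes_per_second / (1024 * 1024):.1f} MB")  # ~26.4 MB
```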
Animation: Animation is a process of making a static image look like it is moving. An animation
is just a continuous series of still images that are displayed in a sequence. The animation can be
used effectively for attracting attention. Animation also makes a presentation light and attractive.
Computer animation is a modern technology, which helps in creating, developing, sequencing, and
displaying a set of images (technically known as ‘frames’). Animation gives visual effects or
motion very similar to that of a video file.
Multimedia Advantages
Increases learning effectiveness
Gains and holds attention
More appealing
Reduces training cost
Easy to use
Gives information to individuals
Provides high quality of presentations
Multi-sensorial
Integrated and interactive
Can be used for a wide variety of audiences
Entertaining and educational
Multimedia Disadvantages
Multimedia requires high-end computer systems
Very Expensive
Not always ready to configure
Requires special hardware
Not always compatible
Takes time to compile
Information overload
Misuse and/or overuse
Limitations of technology
APPLICATIONS OF MULTIMEDIA
Following are the common areas of applications of multimedia.
Multimedia in Business- Multimedia can be used in many applications in a business. Multimedia technology, along with communication technology, has opened the door for information exchange among global work groups. Today, team members may be working anywhere and may work for various companies; thus the workplace becomes global. The multimedia network should support the following facilities:
o Voice Mail
o Electronic Mail
o Multimedia based FAX
o Office Needs
o Employee Training
o Sales and Other types of Group Presentation
o Records Management
Multimedia in Marketing and Advertising- By using multimedia, the marketing of new products can be greatly enhanced. Multimedia boosts communication at an affordable cost and has opened the way for marketing and advertising personnel. Presentations that include flying banners, video transitions, animations, and sound effects are some of the elements used in composing a multimedia-based advertisement that appeals to the consumer in a way never used before and promotes the sale of products.
Multimedia in Entertainment- Entertainment is one of the most visible uses of multimedia: films and television rely on digital audio, video, animation and special effects, and computer and video games combine graphics, sound and interactivity.
Multimedia in Education- Many computer games with a focus on education are now available. Consider, for example, an educational game that plays various rhymes for kids. Apart from just playing the rhymes, the child can paint pictures, increase or reduce the size of various objects, and so on. Several other multimedia packages available in the market provide detailed information and play capabilities to kids.
Multimedia in Banks- The bank is another public place where multimedia is finding more and more application in recent times. People go to a bank to open savings/current accounts, deposit funds, withdraw money, learn about the bank's various financial schemes, obtain loans, etc. Every bank has a lot of information which it wants to impart to its customers, and it can use multimedia in many ways for this purpose. A bank can also display information about its various schemes on a PC monitor placed in the customer waiting area. Today, online and internet banking have become very popular, and these use multimedia extensively. Multimedia is thus helping banks serve their customers and also educate them about the bank's attractive finance schemes.
Multimedia in Hospitals- Multimedia's best use in hospitals is for real-time monitoring of the condition of patients who are critically ill or injured. The conditions are displayed continuously on a computer screen and can alert the doctor or nurse on duty if any changes are observed. Multimedia also makes it possible to consult a surgeon or an expert who can watch an ongoing surgery live on a PC monitor and give online advice at any crucial juncture.
In hospitals, multimedia can also be used to help diagnose illnesses, with CD-ROMs/cassettes/DVDs full of multimedia-based information about various diseases and their treatment. Some hospitals extensively use multimedia presentations in training their junior staff of doctors and nurses. Multimedia displays are now extensively used during critical surgeries.
Multimedia Pedagogues- Pedagogues are useful teaching aids only if they stimulate and motivate the students, and the audio-visual support of a pedagogue can actually help in doing so. A multimedia tutor can provide multiple challenges to the student to stimulate interest in a topic. The instruction provided by pedagogues has moved beyond simple button-level control to intelligent simulations, dynamic creation of links, composition and collaboration, and system testing of user interactions.
Communication Technology and Multimedia Services- Advances in computing power, communication technologies and the relevant standards have ushered in an era in which multimedia facilities are available at home. These services may include:
o Basic Television Services
o Interactive entertainment
o Digital Audio
o Video on demand
o Home shopping
o Financial Transactions
o Interactive multiplayer or single player games
o Digital multimedia libraries
o E-Newspapers, e-magazines
Multimedia Hardware
Most computers nowadays come equipped with the hardware components required to develop and view multimedia applications. The following are the categories into which the various types of hardware required for multimedia applications can be grouped.
Processor: The heart of any multimedia computer is its processor. Today, a Core i5 or higher processor is recommended for a multimedia computer.
o CPU is considered as the brain of the computer.
o CPU performs all types of data processing operations.
o It stores data, intermediate result and instructions (program).
o It controls the operations of all parts of computer.
Memory and Storage Devices - You need memory for storing the various files used during production: the original audio and video clips, edited pieces and final mixed pieces. You also need memory for backups of your project files.
o Primary Memory- Primary memory holds only those data and instructions on which the computer is currently working. It has limited capacity, and data is lost when the power is switched off. It is generally made up of semiconductor devices. These memories are not as fast as registers. The data and instructions required for processing reside in main memory. It is divided into two subcategories, RAM and ROM.
o Cache Memory- Cache memory is a very high speed semiconductor memory which can speed up the CPU. It acts as a buffer between the CPU and main memory. It is used to hold those parts of data and programs which are most frequently used by the CPU. The parts of data and programs are transferred from disk to cache memory by the operating system, from where the CPU can access them.
o Secondary Memory- This type of memory is also known as external or non-volatile memory. It is slower than main memory. It is used for storing data/information permanently. The CPU does not access these memories directly; instead, they are accessed via input-output routines. The contents of secondary memories are first transferred to main memory, and then the CPU can access them. Examples are disks, CD-ROMs, DVDs, etc.
Input Devices - Following are the various types of input devices which are used in
multimedia systems.
o Keyboard- The most common and very popular input device is the keyboard. The keyboard helps in inputting data to the computer. The layout of the keyboard is like that of a traditional typewriter, although there are some additional keys provided for performing additional functions. Keyboards are of two sizes, 84 keys or 101/102 keys, but 104-key or 108-key keyboards are now also available for Windows and the Internet. The keys are as follows:
Typing Keys- These keys include the letter keys (A-Z) and the digit keys (0-9), which generally give the same layout as that of typewriters.
Function Keys- The twelve function keys are present on the keyboard, arranged in a row along the top of the keyboard. Each function key has a unique meaning and is used for some specific purpose.
Special Purpose Keys- The keyboard also contains some special purpose keys such as Enter, Shift, Caps Lock, Num Lock, Space bar, Tab, and Print Screen.
o Joystick - The joystick is also a pointing device, which is used to move the cursor position on a monitor screen. It is a stick having a spherical ball at both its lower and upper ends. The lower spherical ball moves in a socket. The joystick can be moved in all four directions. The function of the joystick is similar to that of a mouse. It is mainly used in Computer-Aided Design (CAD) and for playing computer games.
o Light Pen - Light pen is a pointing device, which is similar to a pen. It is used to
select a displayed menu item or draw pictures on the monitor screen. It consists of
a photocell and an optical system placed in a small tube. When light pen's tip is
moved over the monitor screen and pen button is pressed, its photocell sensing
element detects the screen location and sends the corresponding signal to the CPU.
o Track Ball - The trackball is an input device that is mostly used in notebook or laptop computers instead of a mouse. This is a ball which is half inserted, and by moving fingers on the ball, the pointer can be moved. Since the whole device is not moved, a trackball requires less space than a mouse. A trackball comes in various shapes, like a ball, a button or a square.
o Magnetic Ink Card Reader (MICR) - MICR is an input device generally used in banks because of the large number of cheques to be processed every day. The bank's code number and cheque number are printed on the cheques with a special type of ink that contains particles of magnetic material that are machine readable. This reading process is called Magnetic Ink Character Recognition (MICR). The main advantage of MICR is that it is fast and less error prone.
o Optical Character Reader (OCR) - OCR is an input device used to read printed text. OCR scans the text optically, character by character, converts it into machine-readable code, and stores the text in the system memory.
o Bar Code Readers - A bar code reader is a device used for reading bar-coded data (data in the form of light and dark lines). Bar-coded data is generally used in labelling goods, numbering books, etc. It may be a hand-held scanner or may be embedded in a stationary scanner. A bar code reader scans a bar code image and converts it into an alphanumeric value, which is then fed to the computer to which the bar code reader is connected.
o Optical Mark Reader (OMR) - OMR is a special type of optical scanner used to
recognize the type of mark made by pen or pencil. It is used where one out of a
few alternatives is to be selected and marked. It is specially used for checking the
answer sheets of examinations having multiple choice questions.
o Voice Systems - Voice systems allow spoken sound to be captured and converted into digital form for the computer. The most common voice input device is the microphone, described below.
Microphone- Microphone is an input device to input sound that is then
stored in digital form. The microphone is used for various applications like
adding sound to a multimedia presentation or for mixing music.
o Digital Camera - Digital camera is an input device to input images that is then
stored in digital form. The digital camera is used for various applications like
adding images to a multimedia presentation or for personal purposes.
Output Devices - Following are few of the important output devices, which are used in
Computer Systems:
o Monitors - The monitor, commonly called a Visual Display Unit (VDU), is the main output device of a computer. It forms images from tiny dots, called pixels, that are arranged in a rectangular form. The sharpness of the image depends upon the number of pixels. There are two kinds of viewing screens used for monitors:
Cathode-Ray Tube (CRT) Monitor- In a CRT, the display is made up of small picture elements, called pixels for short. The smaller the pixels, the better the image clarity or resolution. It takes more than one illuminated pixel to form a whole character, such as the letter 'e' in the word 'help'. A finite number of characters can be displayed on a screen at once. The screen can be divided into a series of character boxes - fixed locations on the screen where a standard character can be placed. Most screens are capable of displaying 80 characters of data horizontally and 25 lines vertically.
Printers - Printer is the most important output device, which is used to print information
on paper.
o Dot Matrix Printer- One of the most popular printers in the market is the dot matrix printer, because of its ease of printing and economical price. Each character printed is in the form of a pattern of dots, and the head consists of a matrix of pins of size 5x7, 7x9, 9x7 or 9x9 which come out to form the character; that is why it is called a dot matrix printer.
o Daisy Wheel- The head lies on a wheel, and the pins corresponding to characters are like the petals of a daisy (the flower); that is why it is called a daisy wheel printer. These printers are generally used for word processing in offices that require a few letters to be sent here and there with very high quality output.
o Line Printers- Line printers are printers, which print one line at a time.
o Laser Printers- These are non-impact page printers. They use laser lights to
produce the dots needed to form the characters to be printed on a page.
o Inkjet Printers- These are non-impact printers that form characters by spraying tiny drops of ink onto the paper. They produce good quality output and have many styles of printing modes available. Colour printing is also possible. Some models of inkjet printers can produce multiple copies of printing also.
Screen Image Projector - A screen image projector, or simply projector, is an output device used to project information from a computer onto a large screen so that a group of people can see it simultaneously. A presenter first makes a PowerPoint presentation on the computer. The screen image projector is then plugged into the computer system, and the presenter can make a presentation to a group of people by projecting the information onto a large screen. A projector makes the presentation more understandable.
Speakers and Sound Card - Computers need both a sound card and speakers to play audio, such as music, speech and sound effects. Most motherboards provide an on-board sound card, and this built-in sound card is fine for most purposes. The basic function of a sound card is to convert digital sound signals to analog signals for the speakers and to control the volume (louder or softer).
Multimedia Software
Multimedia software tells the hardware what to do. For example, multimedia software tells the hardware to display the color blue, play the sound of cymbals crashing, etc. To produce these media elements (movies, sound, text, animation, graphics, etc.) there are various software packages available in the market, such as Paint Brush, Photo Finish, Animator, Photoshop, 3D Studio, Corel Draw, Sound Blaster, IMAGINET, Apple HyperCard, Photo Magic and Picture Publisher.
Multimedia Software Categories
Following are the various categories of Multimedia software
Device Driver Software- This software is used to install and configure the multimedia peripherals.
Media Players- Media players are applications that can play one or more kinds of multimedia file formats.
Media Conversion Tools- These tools are used for encoding/decoding multimedia content and for converting one file format to another.
Multimedia Editing Tools- These tools are used for creating and editing digital multimedia data.
Multimedia Authoring Tools- These tools are used for combining different kinds of media formats and delivering them as multimedia content.
Multimedia Applications:
Multimedia applications are created with the help of the tools and packages mentioned below. Sound, text, graphics, animation and video are integral parts of multimedia software. To produce and edit these media elements, there are various software tools available in the market. The categories of basic software tools are:
Text Editing Tools- These tools are used to create letters, resumes, invoices, purchase orders, user manuals for a project and other documents. MS-Word is a good example of a text tool. It has the following features:
o Creating new file, opening existing file, saving file and printing it.
o Insert symbol, formula and equation in the file.
o Correct spelling mistakes and grammatical errors.
o Align text within margins.
o Insert page numbers on the top or bottom of the page.
o Mail-merge documents to make letters and envelopes.
o Make tables with a variable number of columns and rows.
Painting and Drawing Tools- These tools generally come with a graphical user interface with pull-down menus for quick selection. You can create almost all kinds of possible shapes and resize them using these tools. Drawing files can be imported or exported in many image formats like .gif, .tif, .jpg, .bmp, etc. Some examples of drawing software are Corel Draw, Freehand, Designer, Photoshop, Fireworks and Paint. These software packages have the following features:
o Tools to draw a straight line, rectangular area, circle etc.
o Different colour selection option.
o Pencil tool to draw a shape freehand.
o Eraser tool to erase part of the image.
o Zooming for magnified pixel editing.
Image Editing Tools- Image editing tools are used to edit or reshape existing images and pictures. These tools can be used to create an image from scratch, as well as to edit images from scanners, digital cameras, clipart files or original artwork files created with painting and drawing tools. Examples of image editing or processing software are Adobe Photoshop and Paint Shop Pro.
Sound Editing Tools- These tools are used to integrate sound into a multimedia project very easily. You can cut, copy, paste and edit segments of a sound file by using these tools. The presence of sound greatly enhances the effect of a mostly graphic presentation, especially in a video. Examples of sound editing software tools are Cool Edit Pro, Sound Forge and Pro Tools. These software packages have the following features:
o Record your own music, voice or any other audio.
o Record sound from CD, DVD, Radio or any other sound player.
o You can edit, mix the sound with any other audio.
o Apply special effects such as fade, equalizer, echo, reverse and more.
Video Editing Tools- These tools are used to edit, cut, copy, and paste your video and audio files. Video editing used to require expensive, specialized equipment and a great deal of knowledge. The artistic process of video editing consists of deciding which elements to retain, delete or combine from various sources so that they come together in an organized, logical and visually pleasing manner. Today, computers are powerful enough to handle this job, disk space is cheap, and storing and distributing your finished work on DVD is very easy. Examples of video editing software are Adobe Premiere and Adobe After Effects.
Animation and Modeling Tools- An animation shows still images at a certain rate to give a visual effect, created with the help of animation and modeling tools. These tools have features like multiple windows that allow you to view your model in each dimension, the ability to drag and drop primitive shapes into a scene, color and texture mapping, and the ability to add realistic effects such as transparency, shadowing and fog. Examples of animation and modeling tools are 3D Studio Max and Maya.
Image data types
Images can be created using different techniques of data representation, called data types, such as monochrome and colored images. A monochrome image is created using a single color, whereas a colored image is created using multiple colors. Some important image data types are the following:
1-bit images- An image is a set of pixels (a pixel is a picture element in a digital image). In 1-bit images, each pixel is stored as a single bit (0 or 1). A bit has only two states: on or off, white or black, true or false. Therefore, such an image is also referred to as a binary image, since only two states are available. A 1-bit image is also known as a 1-bit monochrome image because it contains one color: black for the off state and white for the on state.
A 1-bit image with resolution 640 x 480 needs a storage space of 640 x 480 bits = (640 x 480) / 8 bytes = (640 x 480) / (8 x 1024) KB = 37.5 KB.
The clarity or quality of a 1-bit image is very low.
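The sketch below shows one way to produce a 1-bit (binary) image from an ordinary photograph; it assumes the Pillow imaging library is installed and that a file named photo.jpg exists, both of which are illustrative assumptions and not part of these notes.

```python
# Minimal sketch: turn an existing picture into a 1-bit (binary) image.
from PIL import Image

img = Image.open("photo.jpg")

gray = img.convert("L")                                        # 8-bit grayscale, values 0-255
binary = gray.point(lambda v: 255 if v > 128 else 0).convert("1")  # threshold, then 1 bit per pixel

binary.save("photo_1bit.png")   # every pixel is now either black or white
```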
8-bit Gray level images- Each pixel of an 8-bit gray level image is represented by a single byte (8 bits). Therefore, each pixel of such an image can hold one of 2^8 = 256 values between 0 and 255, and each pixel has a brightness value on a scale from black (0, for no brightness or intensity) to white (255, for full brightness or intensity). For example, a dark pixel might have a value of 15 and a bright one might be 240.
A grayscale digital image is an image in which the value of each pixel is a single sample,
which carries intensity information. Images are composed exclusively of gray shades,
which vary from black being at the weakest intensity to white being at the strongest.
Grayscale images carry many shades of gray from black to white. Grayscale images are
also called monochromatic, denoting the presence of only one (mono) color (chrome). An
image is represented by bitmap. A bitmap is a simple matrix of the tiny dots (pixels) that
form an image and are displayed on a computer screen or printed.
An 8-bit image with resolution 640 x 480 needs a storage space of 640 x 480 bytes = (640 x 480) / 1024 KB = 300 KB. Therefore an 8-bit image needs 8 times more storage space than a 1-bit image.
24-bit color images - In a 24-bit color image, each pixel is represented by three bytes, usually representing RGB (Red, Green and Blue). Usually, true color is defined to mean 256 shades each of red, green and blue, for a total of 16,777,216 color variations. It provides a method of representing and storing graphical image information in an RGB color space such that a very large number of colors, shades and hues can be displayed in an image, as in high-quality photographic images or complex graphics.
Many 24-bit color images are stored as 32-bit images, with an extra byte for each pixel used to store an alpha value representing special effect information.
A 24-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 x 3 bytes = (640 x 480 x 3) / 1024 KB = 900 KB without any compression. Similarly, a 32-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 x 4 bytes = 1200 KB without any compression.
Disadvantages
o Require large storage space
o Many monitors can display only 256 different colors at any one time. Therefore,
in this case it is wasteful to store more than 256 different colors in an image.
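The storage calculations above all follow a single pattern (width x height x bits per pixel), which the small helper below reproduces for the 1-bit, 8-bit, 24-bit and 32-bit cases; the sizes are for uncompressed images.

```python
# Helper that reproduces the uncompressed storage calculations above.
def image_size_kb(width, height, bits_per_pixel):
    """Return the uncompressed size of an image in kilobytes."""
    total_bits = width * height * bits_per_pixel
    return total_bits / 8 / 1024     # bits -> bytes -> KB

print(image_size_kb(640, 480, 1))    # 1-bit  image ->   37.5 KB
print(image_size_kb(640, 480, 8))    # 8-bit  image ->  300.0 KB
print(image_size_kb(640, 480, 24))   # 24-bit image ->  900.0 KB
print(image_size_kb(640, 480, 32))   # 32-bit image -> 1200.0 KB
```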
8-bit color images - 8-bit color graphics is a method of storing image information in a computer's memory or in an image file in which one byte (8 bits) represents each pixel. The maximum number of colors that can be displayed at once is 256. 8-bit color graphics come in two forms. In the first form, each pixel does not store a color but an 8-bit index into a color map, instead of storing the full 24-bit color value. Therefore, 8-bit image formats consist of two parts: a color map describing the colors present in the image, and the array of index values for each pixel in the image. In most color maps, each color is usually chosen from a palette of 16,777,216 colors (24 bits: 8 red, 8 green, 8 blue).
The other form is where the 8 bits use 3 bits for red, 3 bits for green and 2 bits for blue. This second form is often called 8-bit true color, as it does not use a palette at all. When a 24-bit full-color image is turned into an 8-bit image, some of the colors have to be eliminated; this is known as the color quantization process.
An 8-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 bytes = (640 x 480) / 1024 KB = 300 KB without any compression.
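A minimal sketch of the second form described above, the palette-free "3-3-2" scheme: three bits of red, three of green and two of blue are packed into a single byte, which is one simple way to see how color quantization discards information. The function names are illustrative.

```python
# "8-bit true color" (3-3-2): pack an RGB triple into one byte, no palette involved.
def pack_332(r, g, b):
    """Quantize an (r, g, b) triple (each 0-255) down to a single 3-3-2 byte."""
    return (r >> 5) << 5 | (g >> 5) << 2 | (b >> 6)

def unpack_332(byte):
    """Expand a 3-3-2 byte back to approximate 8-bit channels."""
    r = (byte >> 5) * 255 // 7
    g = ((byte >> 2) & 0b111) * 255 // 7
    b = (byte & 0b11) * 255 // 3
    return r, g, b

print(unpack_332(pack_332(200, 150, 30)))   # roughly (218, 145, 0): some color is lost
```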
Color lookup tables
Short for color lookup table, a CLUT is a method used to transform a range of input colors into another range of colors. In some cases, a CLUT is a function incorporated into an image processing application. Color lookup tables are also used in graphics cards and graphics formats, like GIFs.
For example, a device (e.g., video card) could take colors stored in each pixel of video memory
and convert them into visible colors on a computer monitor or another display device.
A color look-up table (LUT) is a mechanism used to transform a range of input colors into another range of colors. The color look-up table converts the logical color numbers stored in each pixel of video memory into physical colors, represented as RGB triplets, which can be displayed on a computer monitor. Each pixel of the image stores only an index value, or logical color number. For example, if a pixel stores the value 30, the meaning is to go to row 30 in the color look-up table (LUT). The LUT is often called a palette.
Characteristics of a LUT are the following:
The number of entries in the palette determines the maximum number of colors which can appear on screen simultaneously.
The width of each entry in the palette determines the number of colors which the full palette can represent.
A common example would be a palette of 256 colors: the number of entries is 256, and thus each entry is addressed by an 8-bit pixel value. Each color can be chosen from a full palette with a total of 16.7 million colors, that is, each entry is 24 bits wide with 8 bits per channel, which gives 256 levels for each of the red, green and blue components and 256 x 256 x 256 = 16,777,216 possible colors.
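The toy example below illustrates how a color look-up table works: each pixel stores only a logical color number, and the palette maps that number to a physical RGB triplet for display. The four-entry palette and the tiny 2 x 2 "image" are invented for illustration; a real palette would typically hold up to 256 entries of 24 bits each.

```python
# Toy color look-up table (palette): logical color numbers -> physical RGB triplets.
palette = [
    (0, 0, 0),        # index 0: black
    (255, 255, 255),  # index 1: white
    (255, 0, 0),      # index 2: red
    (0, 0, 255),      # index 3: blue
]

image = [             # each value is an index into the palette, not a color
    [0, 2],
    [3, 1],
]

# Convert logical color numbers into the physical colors the display would show.
rgb_image = [[palette[index] for index in row] for row in image]
print(rgb_image)
```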
Image file formats
GIF- Graphics Interchange Format- The GIF format was created by CompuServe. It supports 256 colors. The GIF format is the most popular on the Internet because of its compact size. It is ideal for small icons used for navigational purposes and for simple diagrams. GIF creates a table of up to 256 colors from a pool of 16 million. If the image has fewer than 256 colors, GIF can render the image without any loss of quality. When the image contains more colors, GIF uses algorithms to match the colors of the image with a palette of the optimum set of 256 colors available. Better algorithms search the image to find the optimum set of 256 colors.
Thus the GIF format is lossless only for images with 256 colors or fewer. In the case of a rich, true-color image, GIF may lose 99.998% of the colors. GIF files can be saved with a maximum of 256 colors. This makes it a poor format for photographic images.
GIFs can be animated, which is another reason they became so successful. Most animated banner ads are GIFs. GIFs allow single-bit transparency: when you are creating your image, you can specify which color is to be transparent. This provision allows the background colors of the web page to show through the image.
JPEG- Joint Photographic Experts Group- The JPEG format was developed by the Joint Photographic Experts Group. JPEG files are bitmapped images that store information as 24-bit color. This is the format of choice for nearly all photographic images on the internet. Digital cameras save images in JPEG format by default. It has become the main graphics file format for the World Wide Web, and any browser can support it without plug-ins. In order to make files small, JPEG uses lossy compression. It works well on photographs, artwork and similar materials, but not so well on lettering, simple cartoons or line drawings. JPEG images work much better than GIFs for photographs. Though JPEG can be interlaced, the format lacks many of the other special abilities of GIFs, like animation and transparency; JPEGs really are only for photos.
PNG- Portable Network Graphics- PNG is the only lossless format that web browsers support. PNG supports 8-bit, 24-bit, 32-bit and 48-bit data types. One version of the format, PNG-8, is similar to the GIF format, but PNG is superior to GIF: it produces smaller files with more options for colors, and it also supports partial transparency. PNG-24 is another flavor of PNG, with 24-bit color support, allowing ranges of color akin to a high-color JPEG. PNG-24 is in no way a replacement format for JPEG, because it is a lossless compression format; this means that its file size can be rather big compared with a comparable JPEG. PNG also supports up to 48 bits of color information.
TIFF- Tagged Image File Format- The TIFF format was developed by the Aldus Corporation in the 1980s and was later supported by Microsoft. The TIFF file format is a widely used bitmapped file format. It is supported by many image editing applications, software used by scanners, and photo retouching programs.
TIFF can store many different types of images, ranging from 1-bit images and grayscale images to 8-bit color images, 24-bit RGB images, etc. TIFF files originally used lossless compression; today TIFF files can also use lossy compression according to the requirement. Therefore, it is a very flexible format. This file format is suitable when the output is printed. Multi-page documents can be stored as a single TIFF file, which is why this file format is so popular. The TIFF format is now used and controlled by Adobe.
BMP- Bitmap- The bitmap file format (BMP) is a very basic format supported by most Windows applications. BMP can store many different types of images: 1-bit images, grayscale images, 8-bit color images, 24-bit RGB images, etc. BMP files are usually uncompressed and are therefore not suitable for the internet, although BMP files can be compressed using lossless data compression algorithms.
EPS- Encapsulated Postscript- The EPS format is a vector-based graphics format. EPS is popular for saving image files because it can be imported into nearly any kind of application. This file format is suitable for printed documents. The main disadvantage of this format is that it requires more storage compared with other formats.
PDF- Portable Document Format- The PDF format is vector graphics with embedded pixel graphics and many compression options. It is used when your document is ready to be shared with others or for publication. This is the only format that is platform independent. If you have Adobe Acrobat, you can print from any document to a PDF file. From Illustrator you can save as .PDF.
EXIF- Exchangeable Image File Format- Exif is an image format for digital cameras. A variety of tags are available to facilitate higher quality printing, since information about the camera and the picture-taking conditions can be stored and used by printers for possible color correction algorithms. It also includes a specification of a file format for the audio that accompanies digital images.
WMF- Windows MetaFile- WMF is the vector file format for the MS-Windows operating environment. It consists of a collection of graphics device interface (GDI) function calls to the MS-Windows graphics drawing library. Metafiles are both small and flexible; these images can be displayed properly only by their proprietary software.
PICT- PICT images are useful in Macintosh software development, but you should avoid
them in desktop publishing. Avoid using PICT format in electronic publishing-PICT
images are prone to corruption.
Photoshop- This is the native Photoshop file format created by Adobe. You can import
this format directly into most desktop publishing applications.
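To get a feel for how these formats differ in practice, the hedged sketch below saves the same picture in several of them and prints the resulting file sizes; it assumes the Pillow library is installed and that a file named photo.jpg exists, and the exact sizes will depend entirely on the image used.

```python
# Save one picture in several formats and compare the resulting file sizes.
import os
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")

img.save("out.bmp")                           # BMP: uncompressed bitmap
img.save("out.png")                           # PNG: lossless compression
img.save("out.jpg", quality=75)               # JPEG: lossy compression
img.quantize(colors=256).save("out.gif")      # GIF: reduced to a 256-color palette
img.save("out.tif")                           # TIFF

for name in ("out.bmp", "out.png", "out.jpg", "out.gif", "out.tif"):
    print(name, os.path.getsize(name) // 1024, "KB")
```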
PICTURES FORMATTING
Image editing refers to modifying or improving digital or traditional photographic images using
different techniques, tools or software. Images produced by scanners, digital cameras or other
image-capturing devices may be good, but not perfect. Image editing is done to create the best
possible look for the images and also to improve the overall quality of the image according to
different parameters.
https://edu.gcfglobal.org/en/imageediting101/image-editing-software/1/
https://edu.gcfglobal.org/en/word/formatting-pictures/1/
VISUAL COMMUNICATION
Visual communication is a sub-field within the discipline of communication that examines how
information is conveyed through visual means. People working in this field use images and design
elements to connect with audiences, often in an attempt to persuade or entertain them. Interested
individuals can find options for learning about specific types of visual communications at a variety
of levels from associate's through graduate degree programs. Studying visual communication can
lead to careers in graphic design, advertising, public relations, and mass media.
Visual communication is the practice of using visual elements to convey a message, inspire
change, or evoke emotion.
Infographics
Process Diagrams
Flow Charts
Roadmaps
Charts and Graphs
Visual Reports
Presentations
Mind Maps
Computer-Aided Design (CAD)
The manufacturing process of an object can also be controlled through CAD. Interactive graphics methods are used to lay out buildings. Three-dimensional interior layouts and lighting can also be provided; with virtual-reality systems, the designers can go for a simulated walk inside the building.
Presentation Graphics
It is used to produce illustrations for reports or to generate 35-mm slides for use with projectors.
Examples of presentation graphics are bar charts, line graphs, surface graphs, pie charts and displays showing relationships between parameters. 3-D graphics can make a presentation more attractive.
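As a small example of presentation graphics produced by software, the sketch below draws a simple bar chart; it assumes the matplotlib library is installed, and the category names and values are invented sample data.

```python
# Generate a simple bar chart that could be dropped into a report or slide.
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]   # invented sample data
sales = [120, 150, 90, 180]

plt.bar(quarters, sales, color="steelblue")
plt.title("Quarterly Sales")
plt.xlabel("Quarter")
plt.ylabel("Units sold")
plt.savefig("sales_chart.png")        # saved image for use in a presentation
```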
INTERACTIVE MEDIA
Interactive media is a method of communication in which the program's outputs depend on the
user's inputs, and the user's inputs, in turn, affect the program's outputs. Simply put, it refers to the
different ways in which people process and share information, or how they communicate with one
another. Interactive media allows people to connect with others—whether that's people or
organizations—making them active participants in the media they consume.
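The definition above can be illustrated with a toy program whose output depends entirely on what the user types; this is only a minimal sketch of the input-output loop that underlies interactive media, not an example of any real product.

```python
# Toy interactive loop: the program's output depends on the user's input.
topics = {
    "news": "Here are today's headlines...",
    "weather": "It is sunny and 24 degrees.",
}

while True:
    choice = input("Type 'news', 'weather' or 'quit': ").strip().lower()
    if choice == "quit":
        break                                   # the user decides when to stop
    print(topics.get(choice, "Sorry, I don't know that topic."))
```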
Key Takeaways
Interactive media refers to the different ways in which people process and share
information.
Interactive media is meant to engage the user and interact with them in a way non-
interactive media does not.
Television and radio are the most common examples of non-interactive media.
Examples of interactive media include social media, virtual reality, and apps.
Interactive media was brought about by the internet revolution of the 1990s and improved technology, such as smartphones.
The uses of interactive media are far and wide, from education to networking to video
games.
But with the advent of the internet in the 1990s, that began to change. As technology developed,
consumers were given different tools through which interactive media was presented. Access to
the internet went from an expensive utility once available only through dial-up to a wireless tool
accessible by the touch of a finger.
Computers and laptops became household items and a necessity in the workplace, and smartphones began making interacting with media easy and convenient.
Elements of Interactive Media
Unlike traditional media, interactive media is meant to enhance a user's experience. In order to do so, an interactive medium will require one or more of the following elements:
o Moving images and graphics
o Animation
o Digital text
o Audio
o Video
A user can participate by manipulating one or more of these elements during their experience,
something traditional media does not offer.
Interactive media also has an educational component, making it a very powerful learning tool. It
allows (and encourages) people—especially students—to become more active in their learning
experience, more collaborative, and to be more in control of what they're learning.
Interactive media has changed the way people traditionally approached various personal and
professional aspects of life, such as searching for a job, interviews, going to school, and
advertising.
Social networking websites such as Facebook, Twitter, and Instagram are examples of interactive
media. These sites use graphics and text to allow users to share photos and information about
themselves, chat, and play games.
Video games are another type of interactive media. Players use controllers to respond to visual and
sound cues on the screen that are generated by a computer program.
If you have a mobile device like a smartphone—and chances are that you do—you use apps or
applications. These forms of interactive media can help you figure out the weather, direct you to
the desired location, choose and respond to news stories in which you are interested, and allow
you to shop. The possibilities are endless.
Another form of interactive media is virtual reality, or VR. VR gives users a completely immersive
experience, allowing them to delve into a world that is an almost carbon copy of reality. The only
difference is that this world is digital.
As technology becomes more advanced, interactive media will become even more immersive,
broadening what people are able to do. After all, smartphones and the internet are fairly recent
inventions.
Ergonomics
The term ergonomics (or human factors) is traditionally related to the study of the interaction of physical characteristics: the design of controls, the physical environment in which the interaction takes place, the arrangement, and the physical properties of the display. The primary focus is on the user's performance and how the interface affects it. In order to assess these aspects of interaction, ergonomics also touches on human psychology and system limitations.
Ways of interaction
The interaction can be observed as a dialog between the user and the computer. The choice of interface style can have a profound effect on the nature of this dialog. There are a number of common interface styles, including the command line interface, menus, natural language, question/answer dialog, form-fills, the WIMP interface (windows, icons, menus and pointers) and point-and-click interfaces.
INTERACTION DESIGN
Part of the study of interaction between humans and computers (or machines or technology) focuses on understanding, that is, on the way people interact with technology. However, a great deal of human-computer interaction is about how things work and how they are created, and the credit for these aspects goes to design.
In this part, attention is paid to interaction design, or designing interactivity. Interaction design is not concerned only with the artifact that is produced, whether it is a physical device or a computer program. The artifacts delivered to people include not only devices and programs but also guides, tutorials, and online help systems. In some cases no additional system may be needed at all; it may be easier to propose a different way of using existing tools.
When someone is asked what design is, a simple definition might be that design is about achieving objectives within constraints. This definition does not say everything about design, but it helps us focus on the following elements:
Objectives—What is the purpose of the product being designed? For whom is it made? Why do they want it?
Limitations—What materials should be used? What standards should be adopted? How
much will it cost? How much time is needed to develop the product? Are there any health
and safety issues?
Exchange (compromise)—One must choose which objectives or constraints can be relaxed (adopted in a milder form) and which limits must be respected to the smallest detail.
It is impossible to accomplish all of the user’s objectives within constraints, but in life, everything
is a matter of compromise, even in such cases. The best designs are created in areas where the
designer understands the compromises and the factors affecting them.
Design process
Here is a brief overview of the simplified view of the four major phases focused on interaction
design and interactivity, as well as supporting iteration loop:
One person cannot read about and apply all of the required techniques. Time is limited, and there is no direct link between the length of the design period and the quality of the final design. This means that a design must at some point be accepted as final, even if it is not perfect; it is often better to have a product which is acceptable, is done on time, and costs less than to have one with perfect interaction that was not done on time and went over budget. In fact, if a user encounters a system whose interaction appears to be perfect, it is often a sign of a poorly run design project: not because the design itself is bad, but because a disproportionate amount of effort has been spent on the design process and designing.
INTERACTIVE MULTIMEDIA
Interactive multimedia has been called a "hybrid technology." It combines the storage and retrieval
capabilities of computer database technology with advanced tools for viewing and manipulating
these materials. Multimedia has a lot of different connotations, and definitions vary depending on
the context. For the purposes of this Guide, in the context of upper secondary and postsecondary
education, interactive multimedia is defined by three criteria:
Interactive Multimedia is a package of materials that includes some combination of text, graphics, still images, animation, video, and audio;
These materials are packaged, integrated, and linked together in some way that offers users
the ability to browse, navigate and analyze these materials through various searching and
indexing features, as well as the capacity to annotate or personalize these materials;
Hypermedia
The term "hypermedia" was coined to mean a hypertext that uses multiple media. In other words, hypermedia is a collection of multimedia materials with multiple possible arrangements and sequences. Hypertext and hypermedia are "electronic" concepts that can only exist in a computer-based environment. Only in a computer-based environment can materials be linked and organized in multiple ways simultaneously, and searched, sorted and navigated in hundreds of possible combinations by different users.
Imagine, for example, a large comprehensive textbook on the history of the United States. In a
sense, a print version of that textbook is already "multimedia": that is, in addition to text, it might
have pictures, maps, graphs, charts, timelines; furthermore, the text is made up of many different
texts, being a combination of words written by the author, quotations from historical figures,
perhaps commentary by other historians, and so on. But while the textbook could be thought of as
a text using multiple media and materials, it is not a multimedia hypertext (or hypermedia) because,
as a printed book, it can only be arranged in one order; its materials can only be accessed in the
one way that the author and the publisher arranged them. True, a reader can access the print text
in non-linear ways by using the index at the back of the book, or by jumping around. Still, the text
itself has only one arrangement and one hierarchy of topics; and the reader's ability to navigate the
materials is limited by the table of contents and the index.
Furthermore, a printed history textbook is limited by the constraints of size and practicality. Only
so much information can comfortably fit between two covers of a printed book. Such practical
issues have important consequences for the kinds of materials that go into the hands of readers.
Limitation of size means that it is more practical to write history books that synthesize and make
reference to large bodies of historical documents without being able to include very much or any
of the documents, themselves--even though, for the historian, such documents are part of the vital
material of history.
Now imagine a history textbook in electronic form, constructed as a work of hypermedia: how
would that work be different from a printed text? First, you could fill that "book" with a far greater
number of materials than you could fit between two covers of a printed book (a CD-ROM compact
disk, for example, can hold the equivalent of 300,000 pages of printed material). Second, you could
have a book that was truly "multimedia" in that, in addition to text, photographs, charts, and
timelines, you could have audio (such as folksong recordings, famous speeches), and video (such
as newsreels, film clips). Third, an electronic textbook could be constructed in an entirely different
way from a printed textbook: it could have dozens of potential organizations, and thousands of
internal linkages that could take the reader from one related idea to the next, in ways that would
infinitely vary depending on the context of the reading experience and the interests of the reader.
Consequently, the structure of such a "text" would not be limited to the single storyline or synthesis
offered by the author, but would become an intricate web of interrelationships, something
approaching the complexity of history. In discussing the transformations that ensued in turning the
print version of the history book Who Built America? into a multimedia CD-ROM, editors Roy
Rosenzweig and Steve Brier noted that the very nature of the "book" changed:
The 'spine' of this computer book is a basic survey of American history from 1876-1914. .
. . Added to--and in the process transforming--this textual survey are nearly two hundred
'excursions,' which branch off from the main body of the text. Those excursions contain
about seven hundred source documents in various media that allow students as well as
interested general readers to go beyond (and behind) the printed page and to immerse
themselves in the primary and secondary sources that professional historians use to make
sense of the past.
In the process of making the CD-ROM Who Built America?, by newly linking large amounts of
interrelated materials, a print textbook became an engaging and versatile, multimedia archive of
information. These new kinds of multimedia resources consequently can serve multiple purposes
for many different users. Teachers could use such a text as a resource tool, gathering background
information for class lectures and discovering primary documents to enrich assignments; similarly,
students, at all levels of capability, could use such a resource to begin the discovery process about
historical meaning and materials.
We've seen that interactive multimedia, by definition, has the capacity to deliver large amounts of
materials in multiple forms, and to deliver them in an integrated environment that allows users to
control the reading and viewing experience. How then do these defining characteristics and virtues
translate into benefits in an educational environment?
First of all, multimedia programs bring to education the extraordinary storage and delivery
capabilities of computerized material. This is especially important for schools, libraries, and
learning institutions where books are difficult to obtain and update. Multimedia is a powerful and
efficient source for acquiring learning resources. Multimedia can also provide educational
institutions access to other kinds of inaccessible materials, such as hard to find historical films,
rare sound recordings of famous speeches, illustrations from difficult to obtain periodicals, and so
on. Multimedia can put primary and secondary source materials at the fingertips of users in even
the remotest locations from major research facilities.
Secondly, it is not just sheer access to these materials that makes multimedia a powerful tool, but
the control over those materials that it gives to its users. Interactive multimedia programs enable
the user to manipulate these materials through a wide variety of powerful linking, sorting,
searching and annotating activities. Each of these activities can be made to reinforce and inculcate
various intellectual skills, in addition to satisfying certain cognitive needs for quality learning,
such as the ability to follow through links at the immediate moment when curiosity is aroused, and
the ability to view different forms of the same information side-by-side.
By allowing users to control the sequence and the pacing of the materials, multimedia packages
facilitate greater individualization in learning, allowing students to proceed at their own pace in a
tailored learning environment. Furthermore, interactive multimedia can be a powerful learning and
teaching tool because it engages multiple senses. Students using multimedia are reading, seeing,
hearing, and actively manipulating materials. As one educator enthusiastically put it,
As humans, we seem hard-wired for multiple input. Consider that we remember only about
10% of what we read; 20%, if we hear it; 30%, if we can see visuals related to what we're
hearing; 50%, if we watch someone do something while explaining it; but almost 90%, if
we do the job ourselves -- if only as a simulation. In other words, interactive multimedia --
properly developed and properly implemented -- could revolutionize education. (Menn,
1993)
Although "revolutionize" may be a bit optimistic, interactive multimedia is a promising medium
for reinforcing, extending, and "supplementing" what goes on in the classroom with print
materials, lectures and classroom discussions.
I use the term "supplementing" quite intentionally, however, as the supplementary dimension of multimedia materials is important to keep in mind. Incorporating multimedia into the curriculum does not mean "throwing out the printed books." Most teachers who incorporate some kind of interactive multimedia into their teaching do so to enhance printed materials and the core course content. Multimedia materials help students and teachers by way of reinforcement and extension, not substitution. What hypermedia provides is access to materials and unique, personalized control over them. In other words, interactive multimedia isn't about replacing books, but about replacing the absence of books; hypermedia doesn't do what books do, but what books can't do.
Educational multimedia packages and programs come in the same range and variety as printed
textbooks, and all other teaching and reference materials. Some multimedia programs are broad
and comprehensive; some are more focused. Some programs address themselves to introductory
students in a particular subject; some are suitable for more advanced students, or for teachers and
scholars, or for the general public; and some work well at all ends of the spectrum depending how
they are used. Similarly, some multimedia packages are more focused on primary texts and their
contexts, while others are designed to bring the user into some kind of sophisticated contact with
many different kinds of materials and processes.
There are a lot of different, viable ways to categorize the different kinds of multimedia packages
that are currently available. This Guide and Bibliography uses the following five categories:
WEB DESIGN
Web design uses principles (or elements) of design similar to the ones used in print publications: layout, white space, font, typography, etc. But unlike designing for print, web design is much more dynamic. It allows designers the ability to display or embed interactive tools, applications, video, audio and other media on the webpage.
Web Authoring
Authoring a webpage in the past required designers to understand HTML (HyperText Markup Language) coding. But now there are a number of software tools and WYSIWYG (What You See Is What You Get) editors which allow a novice web designer to design a simple webpage as easily as formatting content in a word processing document.
Also, with the introduction of Content Management Systems (CMS) and blogging software such as Joomla or WordPress (http://www.wordpress.org/), anyone can design their own webpage without any knowledge of HTML. In terms of e-learning, Learning Management Systems (LMS) are a special type of CMS for online education; LMSs also have web design capabilities built in.
Web authoring is the practice of creating web documents using modern web authoring software and tools.
Web authoring software is a type of desktop publishing tool that allows users to navigate the tricky
environment of HTML and web coding by offering a different kind of graphical user interface.
With web authoring tools, the end user can see a visual result that looks a lot like the final project as it is built. Web authoring tools are similar to HTML editors in that they typically allow toggling between an HTML code view and a visual design view. Some of these tools are also called WYSIWYG ("what you see is what you get") editors because, again, they display something that looks like the final project as the user is building it. The alternative is to hand-code a project, which can be frustrating, confusing and daunting for less experienced designers. There are many different tools available for web authoring that help translate HTML coding for those who do not have much experience with web code syntax.
Containing Block
The container can be in the form of the page's body tag or an all-containing div tag. Without a container there would be no place to put the contents of a web page.
Logo
The logo refers to the identity of a website and is used across a company's various forms of marketing such as business cards, letterhead, brochures and so on.
Navigation
The site's navigation system should be easy to find and use. Often the navigation is placed
right at the top of the page.
Content
The content on a website should be relevant to the purpose of the website.
Footer
The footer is located at the bottom of the page. It usually contains copyright, contact and legal
information as well as a few links to the main sections of the site.
Whitespace
Also called negative space, whitespace refers to any area of the page that is not covered by type
or illustrations.
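The sketch below is a minimal, hand-coded page skeleton that ties these elements together. It is
only an illustrative outline; the class names, file names and text are hypothetical, not a prescribed
template, and the whitespace around elements would normally be created with CSS margins and
padding.
<body>                                                     <!-- containing block -->
  <div class="container">
    <img class="logo" src="logo.png" alt="Company logo">  <!-- logo: the site's identity -->
    <nav>                                                  <!-- navigation, placed right at the top -->
      <a href="index.html">Home</a>
      <a href="about.html">About</a>
      <a href="contact.html">Contact</a>
    </nav>
    <main class="content">                                 <!-- content relevant to the site's purpose -->
      <h1>Welcome</h1>
      <p>Page content goes here.</p>
    </main>
    <footer>                                               <!-- copyright, contact and legal information -->
      &copy; 2024 Example Company | <a href="about.html">About</a> | <a href="contact.html">Contact</a>
    </footer>
  </div>
</body>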
A designer should also keep common design mistakes in mind, because well-designed websites
offer much more than just aesthetics. They attract visitors and help people understand the product,
company, and branding through a variety of indicators encompassing visuals, text, and
interactions. That means every element of your site needs to work towards a defined goal.
But how do you achieve that harmonious synthesis of elements? Through a holistic web design
process that takes both form and function into account. In order to design, build, and launch your
website, it's important to follow these steps:
1. Goal identification: Work with the client to determine what goals the new website needs
to fulfill, i.e., what its purpose is.
2. Scope definition: Once we know the site's goals, we can define the scope of the project,
i.e., what web pages and features the site requires to fulfill those goals, and the timeline
for building them out.
3. Sitemap and wireframe creation: With the scope well-defined, we can start digging
into the sitemap, defining how the content and features we defined during scope
definition will interrelate (a simple sitemap sketch follows this list).
4. Content creation: Now that we have a bigger picture of the site in mind, we can start
creating content for the individual pages, always keeping search engine optimization
(SEO) in mind to help keep pages focused on a single topic. It's vital that you have real
content to work with for our next stage:
5. Visual elements: With the site architecture and some content in place, we can start
working on the visual brand. Depending on the client, this may already be well-defined,
but you might also be defining the visual style from the ground up. Tools like style tiles,
moodboards, and element collages can help with this process.
6. Testing: By now, you've got all your pages and defined how they display to the site
visitor, so it's time to make sure it all works. Combine manual browsing of the site on a
variety of devices with automated site crawlers to identify everything from user
experience issues to simple broken links.
7. Launch: Once everything's working beautifully, it's time to plan and execute your site
launch! This should include planning both launch timing and communication strategies
— i.e., when will you launch and how will you let the world know? After that, it's time to
break out the bubbly.
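As referenced in step 3, a first-pass sitemap can be sketched as a simple nested list before any
visual design work begins. The page names below are purely hypothetical and would come from
the scope definition; a sketch in HTML:
<ul>
  <li>Home
    <ul>
      <li>About
        <ul>
          <li>Team</li>      <!-- pages nested under About -->
          <li>History</li>
        </ul>
      </li>
      <li>Services</li>
      <li>Blog</li>
      <li>Contact</li>
    </ul>
  </li>
</ul>
Each nesting level shows which pages sit beneath which, making the structure easy to discuss
with the client before wireframing.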
PROJECT REPORT
A project report is simply a document that provides detail on the overall status of the
project or specific aspects of the project's progress or performance. Regardless of the
type of report, it is made up of project data based on economic, technical, financial,
managerial or production aspects.
Paper Organization
1. Abstract
The abstract is a brief summary of your work. It should be written to make the reader want to
read the rest of your paper. Briefly state the basic contents and conclusions of your paper: the
problem you are solving, why the reader should care about this problem, your unique solution
and/or implementation, and the main results and contributions of your work.
2. Introduction
The introduction is the big picture of your work: what, why, and how. It includes a definition of
the problem you are solving, a high-level description of your solution including any novel
techniques and results you provide, and a summary of the main results of your paper. In
addition, it motivates the problem you are solving (why a reader should find your work
important) and describes your contribution to the area (this may not be applicable to your
project).
The first paragraph of the introduction should contain all of this information at a very
high level. Subsequent paragraphs should discuss in more detail the problem you are
solving, your solution, and your results and conclusions.
should not include a listing of any code you wrote; code examples may be appropriate here
only if your project is about developing an algorithm or a new language.
o Discussion of how your solution solves the problem.
5. Experimental Results demonstrating/proving your solution
o Explain the tests you performed (and why)
o Explain how you gathered the data and details of how your experiments were run (any
system/environment set up)
o Present your results
Choose quality over quantity; the reader will not be impressed with pages and pages of
graphs and tables. Instead, the reader wants to be convinced that your results show something
interesting and that your experiments support your conclusions.
o Discuss your results!
Explain/interpret your results (possibly compare your results to related work). Do not
just present data and leave it up to the reader to infer what the data show and why they
are interesting.
6. Conclusions & Future Directions: Conclude with the main ideas and results of your work.
Discuss ways in which your project could be extended: what's next? What are the interesting
problems and questions that resulted from your work?
7. A brief meta-discussion of your project: Include two paragraphs in this section:
1. Discussion of what you found to be the most difficult and least difficult parts of your
project.
2. In what ways did your implementation vary from your proposal and why?
8. References
At the end of your paper is a References section. You must cite each paper that you have
referenced, since your work is related to some prior work.
Each section of your paper should be organized as follows: high-level important points first,
details second, and a summary of the high-level points last.
You should have a figure showing the high-level design of your implementation.
More detailed writing advice and guidelines can be found here: CS Research and Writing Guide
Print Media
Print media is a form of mass media that shares news and other information through printed
publications. It is the oldest means by which people still share information across an entire group
of audiences. It publishes information only in printed form (hard copy) and then releases it to its
readers, which makes it more reader-friendly than electronic media. Some very popular types of
print media include books, magazines and newspapers. No live reporting, live discussion, or live
show is possible with print media; it follows an interval-update model, with new information
published at fixed intervals.
Magazines, newspapers, flyers, newsletters, scholarly journals and other materials that are
physically printed on paper are examples of print media.
Electronic Media
Electronic media is a form of mass media that shares news and other information with viewers
and audiences through an electronic medium. It serves as a very advanced means of sharing data,
information and news. In this type of media, the publisher uploads or broadcasts the information,
after which any user can easily view it on an electronic device. It is therefore more user-friendly
than print media.
Some of the most popular types of electronic media include mobile app news, television news,
desktop streams, and many more. Electronic media makes it possible for its users to hold live
discussions, live updates, live reporting, and so on, because it uses an immediate-update model.
Examples of digital media include software, digital images, digital video, video games, web
pages and websites, social media, digital data and databases, digital audio such as MP3, electronic
documents and electronic books.
Difference Between Print Media and Electronic Media
Meaning and Definition: Print media is a type of mass media that creates and distributes
(publishes) content via printed means and publications. Electronic media is a type of mass media
that creates and distributes content via the electronic medium and the devices associated with it.
Advancement: Print media is comparatively an earlier type of mass media, while electronic media
is comparatively a very advanced type of mass media.
Literacy: The reader needs to be literate to go through the written information in print media,
although they can still enjoy visual content (such as photo books) without needing to read it.
Literacy is not the primary concern in the case of electronic media, because a majority of the
content and information is in a visual, playable form (audio and video), so one can listen to or
watch the information; however, one still needs literacy to read news, e-books, etc.
Time Required for Editing: Editing information and updating it in print media takes more time.
Information available on electronic media can be updated quickly and easily; the process is much
easier in this case.
Availability: Print media is not available to its targeted audience 24×7; it is rather available at
particular intervals, for example monthly comics, weekly magazines and daily newspapers.
Electronic media is available 24×7; you get the information as soon as a publisher uploads it, for
example television news and news on mobile apps.
Space Occupied: Print media occupies more space because a person needs to physically carry the
printed material around with them, like taking a magazine or newspaper in a bag. Electronic
media can be accessed anywhere at any given time by simply using an electronic device, such as
a laptop or a cellphone.
Deadlines: Deadlines always exist in the case of print media and depend on the collection of the
content; for instance, reporters collect news today and publish it in tomorrow's newspapers. No
such deadlines exist in the case of electronic media.
Live Content: Print media does not allow its users to get access to live shows, concerts, news,
etc. Electronic media is the primary media used by the audience to get access to live content in
the form of reports, discussions, debates, news, etc.
Coverage Area: Print media covers comparatively fewer areas and genres of content because the
type of information it can display is very limited. Electronic media can display all kinds of
information, from text to photographs, audio and video, so it covers more areas, genres and topics
conveniently.
Reader-Friendliness: Print media is comparatively more reader-friendly, while electronic media
is comparatively more viewer-friendly.
The Principles of Design and Their Importance
Good design is possible without understanding the principles of design. But it may take a lot of
trial and error to create something that both looks good and creates an optimal user experience.
As already mentioned, there is no real consensus in the design community about what the main
principles of design actually are. That said, the following principles are those mentioned most
often in articles and books.
Contrast
One of the most common complaints designers have about client feedback often revolves around
clients who say a design needs to “pop” more. While that sounds like a completely arbitrary term,
what the client generally means is that the design needs more contrast.
Contrast refers to how different elements are in a design, particularly adjacent elements. These
differences make various elements stand out. Contrast is also a very important aspect of creating
accessible designs. Insufficient contrast can make text content in particular very difficult to read,
especially for people with visual impairments.
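As a rough illustration, the snippet below shows the same text with sufficient and insufficient
contrast; the colour values are hypothetical. As a general accessibility guideline, the WCAG
recommendations call for a contrast ratio of at least 4.5:1 between normal body text and its
background.
<style>
  .good-contrast { color: #1a1a1a; background-color: #ffffff; }  /* near-black on white: roughly 17:1, easy to read */
  .poor-contrast { color: #bbbbbb; background-color: #ffffff; }  /* light grey on white: roughly 2:1, hard to read */
</style>
<p class="good-contrast">This paragraph has sufficient contrast and is easy to read.</p>
<p class="poor-contrast">This paragraph has insufficient contrast and is difficult to read.</p>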
Balance
There are two basic types of balance: symmetrical and asymmetrical. Symmetrical designs lay out
elements of equal weight on either side of an imaginary center line. Asymmetrical balance uses
elements of differing weights, often laid out in relation to a line that is not centered within the
overall design.
Emphasis
Emphasis deals with the parts of a design that are meant to stand out. In most cases, this means the
most important information the design is meant to convey.
Emphasis can also be used to reduce the impact of certain information. This is most apparent in
instances where “fine print” is used for ancillary information in a design. Tiny typography tucked
away at the bottom of a page carries much less weight than almost anything else in a design, and
is therefore deemphasized.
Proportion
Proportion is one of the easier design principles to understand. Simply put, it’s the size of elements
in relation to one another. Proportion signals what’s important in a design and what isn’t. Larger
elements are more important, smaller elements less.
Hierarchy
Hierarchy is another principle of design that directly relates to how well content can be processed
by people using a website. It refers to the importance of elements within a design. The most
important elements (or content) should appear to be the most important.
Hierarchy is most easily illustrated through the use of titles and headings in a design. The title of
a page should be given the most importance, and therefore should be immediately recognizable as
the most important element on a page. Headings and subheadings should be formatted in a way
that shows their importance in relation to each other as well as in relation to the title and body
copy.
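A minimal sketch of this idea in HTML markup (the wording is placeholder text): browsers, by
default, render each heading level with progressively less visual weight, which mirrors the
hierarchy described above.
<h1>Page Title</h1>               <!-- most important element: largest and most prominent -->
<h2>Section Heading</h2>          <!-- clearly subordinate to the page title -->
<h3>Subsection Heading</h3>       <!-- subordinate to the section heading -->
<p>Body copy carries the least visual weight of all.</p>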
Repetition
Repetition is a great way to reinforce an idea. It’s also a great way to unify a design that brings
together a lot of different elements. Repetition can be done in a number of ways: via repeating the
same colors, typefaces, shapes, or other elements of a design.
This section, for example, uses repetition in the format of the headings. Each design principle is
formatted the same as the others in this section, signaling to readers that they’re all of equal
importance and that they’re all related. Consistent headings unify these elements across the page.
Rhythm
The spaces between repeating elements can cause a sense of rhythm to form, similar to the way
the space between notes in a musical composition creates a rhythm. There are five basic types of
visual rhythm that designers can create: random, regular, alternating, flowing, and progressive.
Random rhythms have no discernible pattern. Regular rhythms follow the same spacing between
each element with no variation. Alternating rhythms follow a set pattern that repeats, but there is
variation between the actual elements (such as a 1-2-3-1-2-3 pattern). Flowing rhythms follow
bends and curves, similar to the way sand dunes undulate or waves flow. Progressive rhythms
change as they go along, with each change adding to the previous iterations.
Rhythms can be used to create a number of feelings. They can create excitement (particularly
flowing and progressive rhythms) or create reassurance and consistency. It all depends on the way
they are implemented.
Pattern
Patterns are nothing more than a repetition of multiple design elements working together.
Wallpaper patterns are the most ubiquitous example of patterns that virtually everyone is familiar
with.
In design, however, patterns can also refer to set standards for how certain elements are designed.
For example, top navigation is a design pattern that the majority of internet users have interacted
with.
White Space
White space, also referred to as "negative space", refers to the areas of a design that do not include
any design elements. The space is, effectively, empty.
Many beginning designers feel the need to pack every pixel with some type of “design” and
overlook the value of white space. But white space serves many important purposes in a design,
the foremost being to give elements of the design room to breathe. Negative space can also help
highlight specific content or specific parts of a design.
It can also make elements of a design easier to discern. This is why typography is more
legible when upper and lowercase letters are used since negative space is more varied
around lowercase letters, which allows people to interpret them more quickly.
Movement
Movement refers to the way the eye travels over a design. The most important element should lead
to the next most important and so on. This is done through positioning (the eye naturally falls on
certain areas of a design first), emphasis, and other design elements already mentioned.
Unity
Everyone has seen a website or other design out there that seemed to just throw elements on a page
with no regard for how they worked together. Newspaper ads that use ten different fonts come to
mind almost immediately.
Unity refers to how well the elements of a design work together. Visual elements should have clear
relationships with each other in a design. Unity also helps ensure concepts are being communicated
in a clear, cohesive fashion. Designs with good unity also appear to be more organized and of
higher quality and authority than designs with poor unity.
Other principles of design are also touched upon in various articles on the subject. These include
typography, color, Gestalt Principles, grid and alignment, framing, and shape. Some definitely fit
the definition of “principles” while others are more like elements of design.
Typography refers to the way text is arranged in a design. That includes the fonts used, their
spacing, size, and weight, and the way different text elements relate to each other. Good
typographic design is heavily influenced by all of the other design principles mentioned earlier in
this article.
The use of color is one of the most psychologically important parts of a design and has a huge
influence on user experience. Color psychology and theory heavily influence some of the other
principles mentioned earlier.
Grid and alignment are closely related to balance and refer to the way elements are arranged in
relation to an invisible grid on the page.
Framing refers to how the primary subject of a design is placed in relation to other elements on the
page. It is most often referred to in cinematography or photography, where it describes how the
main focus of an image is placed within the overall frame. But the principle carries over into design.
Shape is also a major part of any design, both in terms of specific shapes used as elements within
the design, and the overall shape of the design itself. Different shapes can evoke different feelings,
for example, circles are organic and fluid, while squares are more rigid and formal, and triangles give a sense
of energy or movement.
These design “principles” or elements are important aspects of good design and should be
considered alongside the other basic principles to create the best user experiences.
Conclusion
Designers should aim to understand how each of these design principles actually impacts their work.
Studying how other designers have implemented these ideas to structure their own designs is also
an incredibly valuable tool in learning to create better designs.
It’s entirely possible to create a good design without a thorough understanding of these elements
and principles of design. However, it’s typically done by “designer’s intuition” and may take a lot
of trial and error in order to create something that actually looks good and creates an optimal user
experience. Designers could save a lot of time and energy by practicing the principles we have
discussed until they become second nature.