Multimedia Systems: Units 1-4 (Except Last 2 Parts)


FACULTY OF SCIENCE

BA (Computer Applications)
SEMESTER – IV

Multimedia Systems


Internal marks = 20 Practical marks = 50 External Marks = 80

Unit - I
Multimedia: Introduction, Definitions, Where to Use Multimedia- Multimedia in Business, Schools, Home,
Public Places, Virtual Reality; Delivering Multimedia.
Text: Meaning, Fonts and Faces, Using Text in Multimedia, Computers and Text, Font Editing and Design
Tools, Hypermedia and Hypertext.
Images: Before We Start to Create, Making Still Images, Color.

Unit - II
Sound: The Power of Sound, Digital Audio, MIDI Audio, MIDI vs. Digital Audio, Multimedia System
Sounds, Audio File Formats, Adding Sound to Our Multimedia Project.
Animation: The Power of Motion, Principles of Animation, Animation by Computer, Making Animations.
Video: Using Video, How Video Works and Is Displayed, Digital Video Containers, Obtaining Video Clips,
Shooting and Editing Video.
Unit - III
Making Multimedia: The Stages of a Multimedia Project, the Intangibles, Hardware, Software, Authoring
Systems.
Designing and producing: designing the structure, designing the user interface, a multimedia design case
history, producing.

Unit - IV
The Internet and Multimedia: Internet History, Internetworking, Multimedia on the Web.
Designing for the World Wide Web: Developing for the Web, Text for the Web, Images for the Web,
Sound for the Web, Animation for the Web, Video for the Web.
Delivering: Testing, Preparing for Delivery, Delivering on CD-ROM, DVD and World Wide Web,
Wrapping.

Unit -1
Chapter – 1
1. Introduction to Multimedia:
Multimedia refers to the integration of various forms of media elements, such as text, graphics,
audio, video, and interactive content, to convey information or entertainment. It involves the combination of
different media types to create a more engaging and complete experience for the audience. Multimedia can
be found in various aspects of our daily lives, including entertainment, education, business presentations,
and communication.
Definitions:
1. Broad Definition: Multimedia is the use of multiple forms of media to present information or
entertain an audience. It includes a range of media elements such as text, images, audio, video,
and interactive content.
2. Technical Definition: In a technical sense, multimedia involves the integration of different
media types through computer technology. This can include the use of software applications to
create, edit, and present multimedia content.
2. Where to Use Multimedia
1. Multimedia in Business:
 Presentations: Businesses use multimedia extensively for creating dynamic and engaging
presentations. This includes slideshows with images, videos, and audio to convey information
effectively.
 Marketing and Advertising: Multimedia is a key component of marketing strategies.
Companies use videos, animations, and interactive content for online advertisements, product
demonstrations, and promotional campaigns.
 Training and Development: Multimedia is employed for employee training programs. It
includes interactive training modules, simulations, and multimedia content to enhance
learning experiences.
 Web Design: Business websites often incorporate multimedia elements to make them
visually appealing and to convey information more effectively. This includes images, videos,
and interactive features.
2. Multimedia in Schools:

 Interactive Learning: Multimedia is used in educational settings to create interactive learning


materials. This includes e-learning courses, educational software, and multimedia presentations to
make learning more engaging.
 Visual Aids: Teachers use multimedia as visual aids in the classroom. This includes presentations,
educational videos, and interactive software to supplement traditional teaching methods.

 Online Education: With the rise of online education, multimedia is integral to delivering course
content. It includes video lectures, interactive quizzes, and other multimedia elements for a richer
learning experience.
 Digital Libraries: Schools and educational institutions use multimedia to create digital libraries,
providing students with access to a variety of educational resources in different formats.
3. Multimedia at Home:

 Entertainment: Multimedia is a primary source of entertainment at home. It includes watching


movies, listening to music, playing video games, and accessing various forms of digital media.
 Communication: Video calls, multimedia messaging, and social media platforms heavily rely on
multimedia for communication and sharing experiences with friends and family.
 Information Consumption: Multimedia is used at home for consuming news, online articles, and
educational content. This includes reading articles, watching educational videos, and exploring
interactive content.
4. Multimedia in Public Places:

 Digital Signage: Public places such as malls, airports, and transit stations use multimedia for
digital signage. This includes displaying advertisements, announcements, and information on
digital screens.
 Interactive Kiosks: Multimedia kiosks are used in public places to provide information and
services. They may include touchscreens, videos, and interactive content to assist users.
 Museums and Exhibitions: Multimedia is commonly used in museums and exhibitions to
enhance visitor experiences. This includes interactive displays, audio guides, and multimedia
presentations.
 Events and Conferences: Multimedia plays a crucial role in events and conferences. This
includes presentations, live streaming, and interactive displays to engage attendees and convey
information effectively.
In summary, multimedia is ubiquitous across business, schools, homes, and public places, powering
communication, education, entertainment, and information consumption in each of these settings.
5. Virtual Reality (VR):
Virtual reality (VR) is used in many industries, including:
 Entertainment: VR is used in video games, 3D cinema, amusement park rides, and social
virtual worlds.
 Healthcare: VR is used for training and surgery.
 Education: VR offers new methods for teaching and learning.
 Architecture: VR is used for architecture and urban design.
 Digital marketing: VR is used for digital marketing.
 Engineering and robotics: VR is used for engineering and robotics.

VR is also applied in the automotive, space and military, occupational safety, retail, and real estate sectors.
Like multimedia generally, VR depends on interactivity. VR headsets can be driven by networked physical
environments with special effects, or run locally from an offline computer, game system, or simulator.
 Gaming: VR is extensively used in the gaming industry to provide immersive gaming experiences.
 Training Simulations: VR is utilized for training simulations in various fields, such as healthcare,
aviation, and military training.
 Virtual Tours: VR is used to create virtual tours for real estate, tourism, and historical sites.

3. Delivering Multimedia
Delivering multimedia involves the transmission or presentation of various forms of content, such as
text, audio, video, graphics, and animations, to a target audience. The method of delivery can vary
depending on the context and the intended purpose. Here are common ways to deliver multimedia:
1. Web Platforms:
 Websites: Multimedia content can be hosted on websites, providing users with the ability to access
and view content through web browsers. This includes streaming videos, interactive graphics, and
other multimedia elements.
2. Streaming Services:
 Video Streaming: Platforms like YouTube, Netflix, and Vimeo allow users to stream videos in real
time. Live streaming services are also used for events, webinars, and online broadcasts.
 Audio Streaming: Services like Spotify, Apple Music, and podcasts deliver audio content in a
streaming format.
3. Social Media:
 Video and Image Sharing: Social media platforms like Facebook, Instagram, Twitter, and TikTok
enable users to share and view multimedia content, including videos, images, and animations.
4. Email and Newsletters:
 Multimedia Attachments: Multimedia content can be shared through email as attachments or
embedded within the email body. Newsletters often include multimedia elements for engagement.
5. Presentations and Slideshows:
 Microsoft PowerPoint, Google Slides: Multimedia is commonly used in presentations to enhance
communication. Slideshows can include images, videos, and audio elements.
6. Educational Platforms:

 Learning Management Systems (LMS): Multimedia is delivered in educational settings through
online platforms, where students can access course materials, videos, interactive simulations, and
assessments.
7. Mobile Applications:
 Apps: Multimedia content is delivered through mobile applications, providing users with interactive
and engaging experiences on smartphones and tablets.
8. Digital Signage:
 Public Displays: Multimedia content is often delivered through digital signage in public places,
including malls, airports, and transportation hubs, for information, advertisements, and
announcements.
9. Virtual Reality (VR):
 VR Platforms: Virtual reality content is delivered through specialized platforms and applications,
offering immersive experiences in gaming, training, and virtual tours.
10.Broadcasting:
 Television and Radio: Traditional broadcasting methods are still relevant for delivering multimedia
content. TV and radio broadcasts include a mix of audio and visual elements.
11.CDs/DVDs and Physical Media:
 Physical Storage: Multimedia content can be distributed through CDs, DVDs, and other physical
storage media. This is less common in the age of digital delivery but is still used for certain purposes.
When delivering multimedia, it's essential to consider the target audience, the purpose of the
content, and the most suitable platform or method for reaching them effectively. Advances in technology
continue to influence how multimedia is created and delivered, with an emphasis on accessibility,
interactivity, and user engagement.

Chapter -2
Text in Multimedia
All multimedia content contains text in some form. Even a menu item is text, paired with a single action
such as a mouse click, a keystroke, or a finger press on the monitor (in the case of a touch screen). Text in
multimedia is used to communicate information to the user. Proper use of text and words in a multimedia
presentation helps the content developer communicate the idea and message to the user.

Meaning:
Words and symbols in any form, spoken or written, are the most common system of communication.
They deliver the most widely understood meaning to the greatest number of people. Most academic text,
such as journals and e-magazines, is available in web-browser-readable form.

About Fonts and Faces in multimedia
In the context of multimedia, the terms "fonts" and "faces" are often used interchangeably, both referring to
the visual representation of characters and text. Here's an overview of these concepts and their significance
in multimedia design:
Fonts in Multimedia:
1. Definition:
 A font refers to a set of characters with a consistent style, size, and weight. It includes letters,
numbers, punctuation marks, and symbols that share a unified design.
 A typeface is a family of graphic characters that usually includes many type sizes and styles. A font is a
collection of characters of a single size and style belonging to a particular typeface family. Typical
font styles are boldface and italic. Other style attributes, such as underlining and outlining of
characters, may be added at the user's choice.
 The size of a text is usually measured in points. One point is approximately 1/72 of an inch (about
0.0139 inch). The size of a font does not exactly describe the height or width of its characters,
because the x-height (the height of the lowercase character x) of two fonts may differ.
 Typefaces can be described in many ways, but the most common characterization of a typeface is
serif versus sans serif. A serif is the little decoration at the end of a letter stroke. Times,
Times New Roman, and Bookman are fonts that come under the serif category; Arial, Optima, and
Verdana are examples of sans serif fonts. Serif fonts are generally used for the body of the text for
better readability, and sans serif fonts are generally used for headings. The fonts named above
illustrate a few categories of serif and sans serif fonts.
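As a rough sketch (Python is used here purely for illustration, and the function names are our own), the relationship between point size, inches, and on-screen pixels described above can be computed directly:

```python
# A point is 1/72 of an inch, so the pixel size of a font depends on
# the display's resolution in dots per inch (DPI).

def points_to_inches(points):
    """Nominal size in inches of a font measured in points."""
    return points / 72.0

def points_to_pixels(points, dpi):
    """Nominal size in device pixels at a given screen DPI."""
    return points * dpi / 72.0

print(points_to_inches(72))       # 1.0  (72 pt is one inch)
print(points_to_pixels(12, 72))   # 12.0 pixels on a 72-DPI screen
print(points_to_pixels(12, 96))   # 16.0 pixels on a 96-DPI screen
```

Note that, as the text says, this is only a nominal size: two fonts at the same point size can look different because their x-heights differ.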

Selecting Text fonts


It is a difficult process to choose the fonts to be used in a multimedia presentation. Following are a few
guidelines which help to choose a font in a multimedia presentation.
 Use as few different typefaces as possible in a single presentation; a page that mixes many fonts is
derided as "ransom-note typography."
 For small type, it is advisable to use the most legible font.
 In large size headlines, the kerning (spacing between the letters) can be adjusted
 In text blocks, the leading for the most pleasing line can be adjusted.
 Drop caps and initial caps can be used to accent the words.

 The different effects and colors of a font can be chosen in order to make the text look in a distinct
manner.
 Anti-aliasing can be used to make text look gentle and blended.
 For special attention to the text the words can be wrapped onto a sphere or bent like a wave.
 Meaningful words and phrases can be used for links and menu items.
Using Text in Multimedia
PURPOSE
1. To guide the user in navigating through the application.
2. To explain how the application works.
3. Deliver the information for which the application was designed.
# Text consists of two structures:
 Linear
 Non-linear
# Linear:
 A single way to progress through the text, starting at the beginning and reading to the end.
# Non-linear:
 Information is represented in a semantic network in which multiple related sections of the text are
connected to each other.
 A user may then browse through the sections of the text, jumping from one text section to another.
Why is text important?
# Factors affecting legibility of text
1. Size: the size of the text.
2. Background and foreground color: the colors the text is written in and displayed against.
3. Style: also known as the typeface or font.
4. Leading:
 Refers to the amount of added spaces between lines of type.
 Originally, when type was set by hand for printing presses, printers placed slugs, strips
of lead of various thicknesses, between lines of type to add space.
Text technology
1. Based on creating letters, numbers, and special characters.
2. May also include special icons or drawing symbols, mathematical symbols, Greek letters, etc. [©™≈ƒ]
3. Text elements can be categorized into:
 Alphabet characters: A-Z
 Numbers: 0-9
 Special characters: punctuation [. , ; ' ...], signs or symbols [*&^%$#@! ...]
 Together these are also known as character sets.
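The three categories of text elements above can be sketched with a small illustrative Python helper (the function name is our own):

```python
import string

def classify(ch):
    """Place a single character into one of the three text-element
    categories: alphabet characters, numbers, or special characters."""
    if ch in string.ascii_letters:   # A-Z and a-z
        return "alphabet"
    if ch in string.digits:          # 0-9
        return "number"
    return "special"                 # punctuation, signs, symbols

print(classify("A"))  # alphabet
print(classify("7"))  # number
print(classify("@"))  # special
```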
FONT VS TYPEFACE
Font
1. A 'font' is a collection of characters of a particular size and style belonging to a particular typeface
family.
2. Fonts usually vary by type size and style.
3. The sizes are measured in points.
4. A font includes the letter set, the number set, and all of the special characters and diacritical marks we
get by pressing the Shift, Option, or Command/Control keys.
Typeface
1. A 'typeface' is a family of graphic characters that usually includes many type sizes and styles.
2. A typeface contains a series of fonts. For instance, Arial, Arial Black, Arial Narrow, and Arial
Unicode MS are actually 4 fonts under the same family.
 Arial
 Arial Black
 Arial Narrow
 Arial Unicode MS
Font Effects
Font effects are used to bring the viewer's attention to content:
 Case: UPPERCASE and lowercase letters
 Bold, Italic, Underline, Superscript, and Subscript
 Embossed or Shadow
 Colours
 Strikethrough

Leading of Text
 Spacing above and below a font or line spacing
TYPES OF FONTS
 There are two classes of fonts: serif and sans serif.

SERIF TEXT
 Serifs are the decorative strokes added to the ends of a letter's strokes
 Serifs improve readability by leading the eye along the line of type
 Serif faces are best suited for body text
 Serif faces are more difficult to read at small sizes (smaller than 8 pt) and at very large sizes
SANS SERIF TEXT
 Sans serif faces do not have decorative strokes
 Sans serif text has to be read letter by letter
 Use sans serif faces for small (smaller than 8 pt) and very large sizes
 Often used for footnotes and headlines
USING TEXT IN MULTIMEDIA
The text elements used in multimedia are:
 Menus for navigation
 Interactive buttons
 Fields for reading
 HTML documents
 Symbols and icons
TEXT APPLYING GUIDELINES:
 Be concise
 Use appropriate fonts
 Make it readable
 Consider type style and colors
 Use restraint and be consistent

Computers and text


Computers play a fundamental role in handling and processing text within multimedia systems. Text is a
versatile element in multimedia, and computers are responsible for managing, rendering, and manipulating
text to create engaging and dynamic multimedia experiences. Here's how computers interact with text in
multimedia systems:
1. Text Creation and Editing:
 Multimedia content often requires textual information, such as titles, subtitles, captions, and
body text. Computers enable the creation and editing of text through various software
applications, including text editors and graphic design tools.
2. Font Rendering and Typography:
 Computers are responsible for rendering fonts and managing typography. They use font
rendering engines to display text on screens, ensuring proper spacing, kerning, and other
typographic aspects.
3. Text Encoding and Formats:
 Computers handle text encoding and formats to ensure compatibility across different systems
and platforms. Common text formats include ASCII, Unicode, and various markup languages
like HTML for web-based multimedia.
4. Animation and Effects:
 Multimedia systems often incorporate animated text or special effects to enhance visual
appeal. Computers use animation software and graphic processing capabilities to animate
text, apply transitions, and create dynamic effects.
5. Speech Synthesis and Recognition:
 Computers can convert text to speech (TTS) using speech synthesis technologies. This is
particularly useful in multimedia applications where narration or voiceovers are required.
Additionally, computers can perform text recognition for tasks like subtitle generation or
voice command interpretation.
6. Text-to-Image Conversion:
 Multimedia systems may involve converting textual content into images or graphics.
Computers use design software to create visual representations of text, such as infographics,
charts, or stylized text elements.
7. Text Search and Indexing:
 In multimedia databases or content management systems, computers handle text indexing and
search functionalities. This allows users to search and retrieve specific multimedia content
based on textual queries.
8. Interactive Text:
 Computers enable the creation of interactive text elements in multimedia, such as clickable
links, buttons, or text-based user interfaces. This enhances user engagement and interaction
within multimedia applications.
9. Text Compression:
 To optimize storage and transmission of multimedia content, computers may employ text
compression algorithms. This is particularly important when dealing with large volumes of
text data in multimedia databases or streaming applications.
10. Accessibility Features:
 Computers facilitate the implementation of accessibility features for text in multimedia,
including screen readers, closed captions, and subtitles. These features ensure that multimedia
content is accessible to users with diverse needs.
11. Rendering on Different Devices:
 Computers adapt text rendering based on the characteristics of different devices (e.g.,
desktops, tablets, and smartphones). Responsive design and adaptive text rendering are
critical for delivering consistent and readable text across various platforms.
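As a small illustration of item 3 above (text encoding and formats), the following Python sketch shows how the same four-character string occupies different numbers of bytes under different standard encodings, and why plain ASCII cannot hold accented characters:

```python
# The same text stored under different encodings.
text = "Café"

utf8_bytes = text.encode("utf-8")      # 'é' takes 2 bytes in UTF-8
latin1_bytes = text.encode("latin-1")  # 'é' takes 1 byte in Latin-1

print(len(text))          # 4 characters
print(len(utf8_bytes))    # 5 bytes
print(len(latin1_bytes))  # 4 bytes

# Plain ASCII has no code for 'é' at all:
try:
    text.encode("ascii")
except UnicodeEncodeError:
    print("not representable in ASCII")
```

This is why multimedia systems standardize on Unicode (usually as UTF-8) when content must travel across platforms and languages.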

Font Editing and Design tools


There are several software tools that can be used to create customized fonts. These tools help a
multimedia developer communicate an idea or graphic feeling. Using this software, different typefaces
can be created.
In some multimedia projects it may be required to create special characters. Using font editing
tools it is possible to create special symbols and use them throughout the text.
Following is the list of software that can be used for editing and creating fonts:
 Fontographer
 Fontmonger
 Cool 3D text

Special font editing tools can be used to make our own type so we can communicate an idea or
graphic feeling exactly. With these tools professional typographers create distinct text and display faces.

1. Fontographer:
Fontographer is a Macromedia product: a specialized graphics editor for both Macintosh and Windows
platforms. We can use it to create PostScript, TrueType, and bitmapped fonts for Macintosh and Windows.

2. Making Pretty Text:


To make our text look pretty we need a toolbox full of fonts and special graphics applications that
can stretch, shade, color and anti-alias our words into real artwork. Pretty text can be found in bitmapped
drawings where characters have been tweaked, manipulated and blended into a graphic image.

3. Hypermedia and Hypertext:


Multimedia – the combination of text, graphic, and audio elements into a single collection or
presentation – becomes interactive multimedia when we give the user some control over what information is
viewed and when it is viewed.
When a hypermedia project includes large amounts of text or symbolic content, this content can be
indexed and its elements then linked together to afford rapid electronic retrieval of the associated
information. When text is stored in a computer instead of on printed pages, the computer's powerful
processing capabilities can be applied to make the text more accessible and meaningful. Such text is
called hypertext.

4. Hypermedia Structures:
Two buzzwords used often in hypertext are link and node. Links are connections between the
conceptual elements, that is, between the nodes, which may consist of text, graphics, sounds, or related
information in the knowledge base.
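The link-and-node idea can be sketched as a tiny Python knowledge base; the node names and contents here are made up purely for illustration:

```python
# Nodes hold content; links are the connections between nodes.
nodes = {
    "home":   "Welcome page",
    "fonts":  "About fonts and faces",
    "images": "About still images",
}
links = {
    "home":   ["fonts", "images"],  # hyperlinks leading out of each node
    "fonts":  ["home"],
    "images": ["home", "fonts"],
}

def reachable(start):
    """Every node a reader can reach by following links from `start`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(links.get(node, []))
    return seen

print(sorted(reachable("home")))  # ['fonts', 'home', 'images']
```

The non-linear navigation described in the hypertext sections above is exactly this kind of graph traversal, with the user choosing the path.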

5. Searching for words:


Following are typical methods for a word searching in hypermedia systems: Categories, Word
Relationships, Adjacency, Alternates, Association, Negation, Truncation, Intermediate words, Frequency.

Hypermedia and hypertext


Hypermedia and hypertext are concepts that play a significant role in multimedia systems, facilitating
interactive and interconnected content experiences. Both terms are closely related but have distinct
characteristics.

1. Hypertext:
 Definition: Hypertext refers to text that contains links (hyperlinks) to other texts, allowing
users to navigate between related pieces of information in a non-linear manner.

 Usage: In multimedia systems, hypertext is commonly employed in websites, e-books, and
interactive presentations. Users can click on hyperlinks within the text to access additional
information, related content, or navigate to different sections.

2. Hypermedia:
 Definition: Hypermedia extends the concept of hypertext by including various types of
media beyond text, such as images, audio, video, and interactive elements, all interconnected
through hyperlinks.
 Usage: Hypermedia is prevalent in modern multimedia systems, where users can navigate not
only through text but also through a mix of media types. This enhances the overall user
experience and allows for more dynamic and engaging content.
Key features and considerations regarding hypermedia and hypertext in multimedia systems include:
 Interactivity: Hypermedia and hypertext provide interactivity by allowing users to make choices and
navigate through the content based on their preferences. This non-linear approach contrasts with
traditional linear media.
 Navigation Structure: Hypermedia systems often have a hierarchical or networked structure, with
nodes representing individual pieces of content interconnected through hyperlinks. This structure
allows users to move seamlessly between different nodes.
 Rich Media Integration: Hypermedia systems incorporate various media formats, enabling a richer
and more immersive experience. This can include images, audio, video, animations, and interactive
elements, all linked together to convey information more dynamically.
 User Engagement: The non-linear nature of hypermedia and hypertext encourages user engagement
and exploration. Users have the flexibility to choose their own path through the content, making the
experience more personalized.
 Educational and Informational Applications: Hypermedia is widely used in educational
multimedia, interactive tutorials, and informational databases. It allows for the creation of dynamic
learning environments where users can explore topics at their own pace.
 Web Browsing: The World Wide Web is a prime example of a hypermedia system. Web pages are
interconnected through hyperlinks, enabling users to navigate from one page to another, accessing a
variety of content types along the way.
 Authoring Tools: Various authoring tools and multimedia development platforms support the
creation of hypermedia content. These tools often provide features for linking different types of media
and designing interactive navigation.
 Cross-Platform Compatibility: Hypermedia content is designed to be accessible across different
devices and platforms. Web browsers, multimedia players, and other applications support the
rendering and interaction of hyperlinked content.

Chapter - 3
Images
Introduction and Before we start to create:
Still images are an important element of a multimedia project or a web site. To make a
multimedia presentation look elegant and complete, it is necessary to spend ample time designing
the graphics and the layouts. Competent, computer-literate skills in graphic art and design are vital to the
success of a multimedia project.
Images play a crucial role in multimedia systems, enriching the overall user experience by adding
visual elements to complement other forms of media such as text, audio, and video. Here are some key
aspects related to images in multimedia systems:
1. Formats and Compression:
 Images can be stored in various formats such as JPEG, PNG, GIF, BMP, and others. Each
format has its own advantages and is suitable for different types of images and use cases.
 Compression techniques are often applied to reduce the file size of images, making them
more manageable for storage and faster to transmit over networks.
2. Color Models:
 Images are typically represented using color models such as RGB (Red, Green, Blue) or
CMYK (Cyan, Magenta, Yellow, Key/Black). The choice of color model depends on the
application and requirements of the multimedia system.
3. Resolution:
 The resolution of an image refers to the number of pixels it contains. Higher resolution
images generally provide more detail but may require more storage space and computational
resources.
4. Image Editing and Processing:
 Multimedia systems often incorporate image editing and processing capabilities to
manipulate and enhance images. This may include tasks such as cropping, resizing, filtering,
and applying various effects.
5. Integration with Other Media:
 Images are frequently combined with other forms of media, such as text, audio, and video, to
create a rich multimedia experience. Presentations, websites, and educational materials often
leverage a combination of media types.

6. Multimedia Authoring Tools:
 Specialized software tools exist for creating multimedia content, allowing users to integrate
and synchronize different media types, including images. These tools often have features for
arranging, editing, and enhancing images within a multimedia project.
7. Interactive Multimedia:
 In interactive multimedia applications, users may have control over how they interact with
images. This can include zooming, panning, rotating, or clicking on specific areas for
additional information.
8. Virtual and Augmented Reality:
 In virtual reality (VR) and augmented reality (AR) applications, images are a fundamental
component, providing the visual content that users interact with in a simulated or enhanced
environment.
9. Accessibility:
 Multimedia systems need to consider accessibility, ensuring that images are described
adequately for users with visual impairments. This is often done through alternative text or
other accessibility features.
10.Streaming and Delivery:
 Images, especially in web-based multimedia systems, need to be delivered efficiently.
Techniques such as lazy loading and content delivery networks (CDNs) are employed to
optimize image loading times.
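As an illustrative aside on item 2 above (color models), a single RGB pixel can be reduced to one grayscale intensity. The sketch below uses the common ITU-R BT.601 luminance weights, which is one of several possible formulas:

```python
# Convert an RGB pixel (0-255 per channel) to a grayscale intensity.
# Green is weighted most heavily because the eye is most sensitive to it.
def rgb_to_gray(r, g, b):
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(rgb_to_gray(255, 255, 255))  # 255 (white)
print(rgb_to_gray(0, 0, 0))        # 0   (black)
print(rgb_to_gray(255, 0, 0))      # 76  (pure red is fairly dark in gray)
```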

Before we start to create


Digital Image
A digital image is represented by a matrix of numeric values, each representing a quantized intensity
value. If I is a two-dimensional matrix, then I(r,c) is the intensity value at the position corresponding to
row r and column c of the matrix.
The points at which an image is sampled are known as picture elements, commonly abbreviated as
pixels. The pixel values of intensity images are called gray-scale levels (they encode the "color" of the
image). The intensity at each pixel is represented by an integer and is determined from the continuous image
by averaging over a small neighborhood around the pixel location. If there are just two intensity values, for
example black and white, they are represented by the numbers 0 and 1; such images are called binary-
valued images. If 8-bit integers are used to store each pixel value, the gray levels range from 0 (black) to 255
(white).
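The grayscale and binary-valued representations just described can be sketched as a small Python matrix; the sample intensity values are arbitrary:

```python
# An 8-bit grayscale image as a matrix I, where I[r][c] is the intensity
# at row r, column c (0 = black, 255 = white).
I = [
    [  0,  60, 200],
    [130, 255,  90],
]

def to_binary(image, threshold=128):
    """Reduce a grayscale image to a binary-valued image:
    1 where the intensity reaches the threshold, 0 elsewhere."""
    return [[1 if v >= threshold else 0 for v in row] for row in image]

print(to_binary(I))  # [[0, 0, 1], [1, 1, 0]]
```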

Digital Image Format
There are different kinds of image formats in the literature. We shall consider the image format that
comes out of an image frame grabber, i.e., the captured image format, and the format when images are
stored, i.e., the stored image format.

Captured Image Format


The image format is specified by two main parameters: spatial resolution, which is specified as
pixels × pixels (e.g., 640×480), and color encoding, which is specified in bits per pixel. Both parameter
values depend on the hardware and software for input/output of images.
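The two parameters above determine the uncompressed size of a captured frame, which this illustrative sketch computes:

```python
# Uncompressed frame size = spatial resolution x color depth.
def frame_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

# A 640x480 frame at 24 bits per pixel (true color):
print(frame_bytes(640, 480, 24))  # 921600 bytes, i.e. 900 KB
# The same frame at 8 bits per pixel (256 gray levels or colors):
print(frame_bytes(640, 480, 8))   # 307200 bytes
```

Numbers like these explain why compression matters so much for stored and transmitted images.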

Stored Image Format


When we store an image, we are storing a two-dimensional array of values, in which each value
represents the data associated with a pixel in the image. For a bitmap, this value is a binary digit.

Bitmaps
A bitmap is a simple information matrix describing the individual dots that are the smallest elements of
resolution on a computer screen or other display or printing device. A one-dimensional (one-bit-deep)
matrix is sufficient for monochrome (black and white); greater depth (more bits of information per pixel) is
required to describe the more than 16 million colors the picture elements may have, as illustrated in the
following figure. The states of all the pixels on a computer screen make up the image seen by the viewer,
whether in combinations of black-and-white or colored pixels in a line of text, a photograph-like picture, or
a simple background pattern.
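The relationship between bit depth and the number of representable colors is simply 2 raised to the depth, which is where the "more than 16 million colors" figure comes from:

```python
# Number of distinct colors a pixel of the given bit depth can take.
def color_count(bit_depth):
    return 2 ** bit_depth

print(color_count(1))   # 2        (monochrome: black and white)
print(color_count(8))   # 256
print(color_count(24))  # 16777216 (24-bit "true color")
```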

Where do bitmaps come from? How are they made?


 Make a bitmap from scratch with a paint or drawing program.
 Capture a bitmap from an active computer screen with a screen-capture program, and then paste it into
a paint program or our application.
 Capture a bitmap from a photo, artwork, or a television image using a scanner or video capture device
that digitizes the image. Once made, a bitmap can be copied, altered, e-mailed, and otherwise used in
many creative ways.
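Once a bitmap's resolution and bit depth are known, its uncompressed size follows directly. A minimal sketch (the function name is illustrative):

```python
def bitmap_bytes(width, height, bit_depth):
    """Uncompressed size in bytes: one value of `bit_depth` bits per pixel."""
    return width * height * bit_depth // 8

# A 640x480 screen: monochrome (1-bit) vs. 24-bit "true color"
print(bitmap_bytes(640, 480, 1))    # 38400 bytes
print(bitmap_bytes(640, 480, 24))   # 921600 bytes
```

The jump from 38,400 to 921,600 bytes for the same screen shows why greater color depth costs so much memory.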

Clip Art
A clip art collection may contain a random collection of images, or it may contain a series of
graphics, photographs, sound, and video related to a single topic. For example, Corel, Micrografx, and
Fractal Design bundle extensive clip art collections with their image-editing software.

Making Still Images


Still images may be small or large, or even full screen. Whatever their form, still images are
generated by the computer in two ways: as bitmap (or paint graphics) and as vector-drawn (or just plain
drawn) graphics.
Bitmaps are used for photo-realistic images and for complex drawings requiring fine detail. Vector-
drawn objects are used for lines, boxes, circles, polygons, and other graphic shapes that can be
mathematically expressed in angles, coordinates, and distances. A drawn object can be filled with color and
patterns, and we can select it as a single object. Typically, image files are compressed to save memory and
disk space; many image formats already use compression within the file itself – for example, GIF, JPEG,
and PNG.
Still images may be the most important element of our multimedia project. If we are designing the
multimedia ourselves, we should put ourselves in the role of graphic artist and layout designer.

Bitmap Software
The abilities and features of image-editing programs for both the Macintosh and Windows range from
simple to complex. The Macintosh does not ship with a painting tool, and Windows provides only the
rudimentary Paint (see following figure), so we will need to acquire this very important software separately
– often bitmap editing or painting programs come as part of a bundle when we purchase our computer,
monitor, or scanner.

Figure: The Windows Paint accessory provides rudimentary bitmap editing

Capturing and Editing Images
The image that is seen on a computer monitor is a digital bitmap stored in video memory, updated
about every 1/60 second or faster, depending upon the monitor's scan rate. When images are assembled for a
multimedia project, it is often necessary to capture and store an image directly from the screen. The Prt Scr
key on the keyboard can be used to capture an image.

Scanning Images
After scanning through countless clip art collections, we may still be unable to find the unusual
background we want for a screen about gardening. Sometimes when we search for something too hard, we
don't realize that it's right in front of our face – in that case we can scan an image of our own. Open the scan
in an image-editing program and experiment with different filters, the contrast, and various special effects.
Be creative, and don't be afraid to try strange combinations – sometimes mistakes yield the most intriguing
results.

Vector Drawing
Most multimedia authoring systems provide for use of vector-drawn objects such as lines, rectangles,
ovals, polygons, and text.
Computer-aided design (CAD) programs have traditionally used vector-drawn object systems for
creating the highly complex and geometric rendering needed by architects and engineers.
Graphic artists designing for print media use vector-drawn objects because the same mathematics
that put a rectangle on our screen can also place that rectangle on paper without jaggies. This requires the
higher resolution of the printer, using a page description language such as PostScript.
Programs for 3-D animation also use vector-drawn graphics. For example, the various changes of
position, rotation, and shading of light required to spin an extruded logo are calculated mathematically.

How Vector Drawing Works


Vector-drawn objects are described and drawn to the computer screen using a fraction of the memory space
required to describe and store the same object in bitmap form. A vector is a line that is described by the
location of its two endpoints. A simple rectangle, for example, might be defined by the coordinates of two
opposite corners, as follows:
RECT 0, 0, 200, 200
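The memory savings of vector drawing can be illustrated with a quick comparison, assuming a 1-bit bitmap of the same 200×200 rectangle (the variable names are illustrative):

```python
# Vector form: the rectangle is just a keyword and two corner points.
vector_rect = ("RECT", 0, 0, 200, 200)   # five values in total

# Bitmap form: one 1-bit value per pixel inside the same area.
bitmap_size = 200 * 200 // 8             # 5000 bytes at 1-bit depth

print(len(vector_rect), bitmap_size)
```

Five numbers versus five thousand bytes, for the same shape on screen.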

Color
Color is a vital component of multimedia. Management of color is both a subjective and a technical
exercise. Picking the right colors and combinations of colors for our project can involve many tries until we
feel the result is right.

Understanding Natural Light and Color


The letters of the mnemonic ROY G. BIV, learned by many of us to remember the colors of the
rainbow, are the ascending frequencies of the visible light spectrum: red, orange, yellow, green, blue, indigo,

and violet. Ultraviolet light, on the other hand, is beyond the higher end of the visible spectrum and can be
damaging to humans.
The color white is a noisy mixture of all the color frequencies in the visible spectrum. The cornea of
the eye acts as a lens to focus light rays onto the retina. The light rays stimulate many thousands of
specialized nerves called rods and cones that cover the surface of the retina. The eye can differentiate among
millions of colors, or hues, consisting of combinations of red, green, and blue.

Additive Color
In additive color model, a color is created by combining colored light sources in three primary
colors: red, green and blue (RGB). This is the process used for a TV or computer monitor.

Subtractive Color
In subtractive color method, a new color is created by combining colored media such as paints or ink
that absorb (or subtract) some parts of the color spectrum of light and reflect the others back to the eye.
Subtractive color is the process used to create color in printing. The printed page is made up of tiny halftone
dots of three primary colors, cyan, magenta and yellow (CMY).
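The two models are complements of each other: on a 0–255 scale, a subtractive (CMY) color is what remains after removing the additive (RGB) components from white. A simple illustrative conversion:

```python
def rgb_to_cmy(r, g, b):
    """Subtractive primaries are the complements of the additive ones (0-255 scale)."""
    return (255 - r, 255 - g, 255 - b)

# Pure red light corresponds to ink with no cyan but full magenta and yellow.
print(rgb_to_cmy(255, 0, 0))   # (0, 255, 255)
```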

Unit – II

Sound: The Power of Sound, Digital Audio, MIDI Audio, MIDI vs. Digital Audio, Multimedia System

Sounds, Audio File Formats, Adding Sound to Our Multimedia Project.

Animation: The Power of Motion, Principles of Animation, Animation by Computer, Making Animations.

Video: Using Video, How Video Works and Is Displayed, Digital Video Containers, Obtaining Video Clips,

Shooting and Editing Video.

Chapter – 1
Sound
Introduction
Sound is possibly the most important element of multimedia. It is meaningful "speech" in any
language, from an undertone to a scream. It can provide the listening pleasure of music, the startling accent
of special effects, or the feel of a mood-setting background. Sound is the term used for the analog form, and
the digitized form of sound is called audio.

Power of Sound
When something vibrates in the air, moving back and forth, it creates waves of pressure. These
waves spread like the ripples from a stone tossed into a still pool, and when they reach our eardrums, the
change of pressure, or vibration, is experienced as sound.
Acoustics is the branch of physics that studies sound. Sound pressure levels are measured in
decibels (dB); a decibel measurement is actually the ratio between a chosen reference point on a logarithmic
scale and the level that is actually experienced.
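The logarithmic ratio described above can be computed directly. A sketch using the conventional 20-micropascal threshold of hearing as the reference point (the function name is illustrative):

```python
import math

def spl_db(pressure, reference=20e-6):
    """Sound pressure level in decibels relative to the reference pressure."""
    return 20 * math.log10(pressure / reference)

print(round(spl_db(20e-6)))   # the reference itself is 0 dB
print(round(spl_db(0.2)))     # 10,000x the reference pressure is 80 dB
```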

Multimedia Sound Systems:

The multimedia application user can use sound right off the bat on both the
Macintosh and on a multimedia PC running Windows because beeps and warning sounds
are available as soon as the operating system is installed. On the Macintosh we can
choose one of several sounds for the system alert. In Windows, system sounds are
WAV files, and they reside in the windows\Media subdirectory.
There are still more choices of audio if Microsoft Office is installed. Windows makes use
of WAV files as the default file format for audio and Macintosh systems use SND as
default file format for audio.

Digital Audio:

Digital audio is created when a sound wave is converted into numbers – a process
referred to as digitizing. It is possible to digitize sound from a microphone, a synthesizer,
existing tape recordings, live radio and television broadcasts, and popular CDs. We can
digitize sounds from natural or prerecorded sources.
Digitized sound is sampled sound. Every nth fraction of a second, a sample of
sound is taken and stored as digital information in bits and bytes. The quality of this
digital recording depends upon how often the samples are taken.

Preparing Digital Audio Files:

Preparing digital audio files is fairly straightforward, even when we begin with analog source
materials – music or sound effects that we have recorded on analog media such as
cassette tapes.
 The first step is to digitize the analog material by recording it onto a computer-
readable digital medium.
 It is necessary to focus on two crucial aspects of preparing digital audio files:
 Balancing the need for sound quality against our available RAM and
Hard disk resources.
 Setting proper recording levels to get a good, clean recording.
Remember that the sampling rate determines the frequency at which samples will
be drawn for the recording. Sampling at higher rates more accurately captures the high
frequency content of our sound. Audio resolution determines the accuracy with which a
sound can be digitized.
Formula for determining the size of the digital audio

Monophonic = Sampling rate * duration of recording in seconds * (bit resolution / 8) * 1


Stereo = Sampling rate * duration of recording in seconds * (bit resolution / 8) * 2

 The sampling rate is how often the samples are taken.


 The sample size is the amount of information stored. This is called as bit resolution.
 The number of channels is 2 for stereo and 1 for monophonic.
 The time span of the recording is measured in seconds.
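The formula above can be checked with a short calculation; for example, ten seconds of CD-quality stereo (44,100 Hz, 16-bit, 2 channels):

```python
def audio_bytes(sample_rate, seconds, bit_resolution, channels):
    """Uncompressed size: rate * duration * (bits / 8) * channels."""
    return int(sample_rate * seconds * (bit_resolution / 8) * channels)

print(audio_bytes(44100, 10, 16, 2))   # 1764000 bytes, about 1.7 MB
print(audio_bytes(44100, 10, 16, 1))   # mono is half that: 882000 bytes
```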

Editing Digital Recordings:

Once a recording has been made, it will almost certainly need to be edited. The
basic sound editing operations that most multimedia producers need are described in
the paragraphs that follow.
1. Multiple Tasks: Able to edit and combine multiple tracks and then merge the
tracks and export them in a final mix to a single audio file.
2. Trimming: Removing dead air or blank space from the front of a recording and
any unnecessary extra time off the end is our first sound editing task.

3. Splicing and Assembly: Using the same tools mentioned for trimming, we will
probably want to remove the extraneous noises that inevitably creep into a
recording.
4. Volume Adjustments: If we are trying to assemble ten different recordings into
a single track, there is little chance that all the segments will have the same volume.
5. Format Conversion: In some cases our digital audio editing software might
read a format different from that read by our presentation or authoring program.

6. Resampling or downsampling: If we have recorded and edited our sounds at
16-bit sampling rates but are using lower rates in our project, we must resample or
downsample the file.
7. Equalization: Some programs offer digital equalization capabilities that allow
us to modify a recording's frequency content so that it sounds brighter or darker.
8. Digital Signal Processing: Some programs allow us to process the signal with
reverberation, multitap delay, and other special effects using DSP routines.
9. Reversing Sounds: Another simple manipulation is to reverse all or a portion of a
digital audio recording. Sounds can produce a surreal, otherworldly effect when
played backward.
10. Time Stretching: Advanced programs let us alter the length of a sound file
without changing its pitch. This feature can be very useful but watch out: most
time stretching algorithms will severely degrade the audio quality.

MIDI (Musical Instrument Digital Interface)


MIDI (Musical Instrument Digital Interface) is a communication standard developed for electronic
musical instruments and computers. MIDI files allow music and sound synthesizers from different
manufacturers to communicate with each other by sending messages along cables connected to the devices.
Creating your own original score can be one of the most creative and rewarding aspects of building a
multimedia project, and MIDI (Musical Instrument Digital Interface) is the quickest, easiest and most
flexible tool for this task.
The process of creating MIDI music is quite different from digitizing existing audio. To make MIDI
scores, however you will need sequencer software and a sound synthesizer.
The MIDI keyboard is also useful to simplify the creation of musical scores. An advantage of
structured data such as MIDI is the ease with which the music director can edit the data.
A MIDI file format is used in the following circumstances:
 When digital audio will not work because of memory constraints or high processing-power
requirements
 When there is a high-quality MIDI source
 When there is no requirement for dialogue.
A digital audio file format is preferred in the following circumstances:
 When there is no control over the playback hardware
 When the computing resources and the bandwidth requirements are high.
 When dialogue is required.
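The "commands" a MIDI file stores are compact multi-byte messages. For example, a standard Note On message is three bytes: a status byte (0x90 plus the channel number), the note number, and the velocity. A sketch of how one might be built:

```python
def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message: status (0x90 | channel), note, velocity."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

# Middle C (note 60) on channel 0 at moderate velocity
print(note_on(0, 60, 100).hex())   # 903c64
```

Three bytes per note event, rather than thousands of audio samples, is why MIDI files are so much smaller than digital audio files.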

Audio File Formats


A file format determines the application that is to be used for opening a file.
Following is the list of different file formats and the software that can be used for opening a specific
file.
1. *.AIF, *.SDII in Macintosh Systems
2. *.SND for Macintosh Systems
3. *.WAV for Windows Systems
4. MIDI files – used by both Macintosh and Windows
5. *.WMA –windows media player
6. *.MP3 – MP3 audio
7. *.RA – Real Player
8. *.VOC – VOC Sound
9. AIFF sound format for Macintosh sound files
10. *.OGG – Ogg Vorbis

MIDI vs. Digital Audio

The main differences between digital audio and MIDI are:

1. Digital audio refers to the reproduction and transmission of sound stored in a digital format;
MIDI is a communication standard for representing musical information in a digital format.
2. Digital audio is a digital representation of physical sound waves; MIDI is an abstract
representation of musical notes and sound effects.
3. Digital audio comprises analog sound waves that have been converted into a series of 0s and
1s; MIDI comprises a series of commands that represent musical notes, volume, and other
musical parameters.
4. Actual sound is stored in a digital audio file; no actual sound is stored in a MIDI file.
5. Digital audio files are large; MIDI files are small and compact.
6. The quality of digital audio is in proportion to the file size; the quality of MIDI playback is
not in proportion to the file size.
7. Digital audio reproduces the exact sound in a digital format; MIDI playback may sound a
little different from the original, since the result depends on the playback device.
8. Digital audio is used for recording and playback of music, sound effects, and voiceovers;
MIDI is used for creating and controlling electronic music, such as synthesizers and drum
machines.

Adding Sound to Your Multimedia Project

Adding sound to a multimedia project can enhance the overall experience and engagement for the
audience. Depending on the type of multimedia project you are working on, there are various ways to
incorporate sound. Here are some general guidelines and methods you can use:
1. Audio Formats:
 Ensure that your audio files are in a compatible format. Common formats include MP3, WAV,
and AAC.
 Choose a format that balances quality and file size, depending on the platform and purpose of
your multimedia project.
2. Sound Editing Software:
 Use sound editing software to create and edit your audio files. Popular tools include Audacity,
Adobe Audition, and GarageBand.
 Edit your audio files to remove any unwanted background noise, adjust volume levels, and make
other necessary enhancements.
3. Narration:
 If your multimedia project involves a presentation or storytelling, consider adding a narration
track. This can be a voiceover explaining the content or providing additional context.
 Ensure that the narration is clear, well-paced, and synchronized with the visuals.
4. Background Music:
 Background music can add atmosphere and emotion to your multimedia project. Choose music
that complements the mood and theme of your content.
 Adjust the volume levels so that the music doesn't overpower other audio elements or distract
from the main content.

5. Sound Effects:
 Incorporate sound effects to enhance specific actions or events in your multimedia project. For
example, footsteps, doorbell rings, or applause can make your content more dynamic.
 Ensure that the sound effects are relevant and not overly loud or distracting.
6. Timing and Synchronization:
 Pay attention to the timing and synchronization of your audio elements with the visual
components. Proper synchronization enhances the overall impact of your multimedia project.
 Use timeline-based editing features in multimedia authoring tools to align sound events with
visual cues.
7. Multimedia Authoring Tools:
 Use multimedia authoring tools like Adobe Animate, Adobe Premiere, or other similar
platforms that allow you to integrate audio seamlessly with visuals.
 These tools often provide features for layering audio tracks, adjusting volume levels, and
synchronizing audio with animations.
8. Compression:
 Compress audio files appropriately to balance file size and audio quality, especially if your
multimedia project is intended for online distribution.
9. Testing:
 Test your multimedia project on different devices and platforms to ensure that the audio plays
correctly and is well-balanced across various environments.
10. Accessibility:
 Consider accessibility by providing options for subtitles or transcripts, especially if your
multimedia project includes spoken content.

Chapter – 2
Animation
Introduction
Animation makes static presentations come alive. It is visual change over time and can add great
power to our multimedia projects. Carefully planned, well-executed animation can make a dramatic
difference in a multimedia project. Animation is created from drawn pictures, while video is created using
real-time visuals.

The Power of Motion


Animation, as a dynamic form of visual representation, holds tremendous power in multimedia systems. It
adds a layer of engagement and expressiveness that static elements often lack. Here are some key aspects of
the power of motion through animation in multimedia systems:
1. Visual Engagement:
 Animation captures attention and engages viewers more effectively than static images or text.
Moving elements on the screen draw the eye and encourage users to interact with the content.
2. Storytelling:
 Animation is a powerful storytelling tool. It allows for the creation of characters, scenes, and
dynamic narratives that can convey information or emotions in a compelling way. Animated
characters can become relatable and memorable.
3. Dynamic Presentations:
 In presentations, animation helps break down complex ideas into more digestible components.
Animated charts, graphs, and diagrams can improve the clarity of information and make
presentations more engaging.
4. User Interface (UI) Feedback:
 Animated UI elements provide feedback to users, making interactions more natural. For
example, button animations, loading indicators, and transition effects communicate system
responses and guide users through the interface.
5. Expressive Design:
 Animation allows for creative and expressive design choices. From subtle transitions to
elaborate visual effects, animation can convey a brand's personality and improve the overall
visual appeal of multimedia projects.
6. Educational Multimedia:
 Animation is widely used in educational multimedia to demonstrate concepts, simulate
processes, and bring historical or scientific subjects to life. Interactive animations enable
learners to engage with content actively.

7. Advertising and Marketing:
 Animated advertisements are more attention-grabbing and memorable than static ones.
Animation in marketing materials can effectively communicate product features, promotions,
or brand messages.
8. Entertainment:
 Animation is a cornerstone of multimedia entertainment, including movies, TV shows, and
video games. Characters and scenes come to life through animation, providing immersive and
enjoyable experiences for audiences.
9. Web and Mobile Design:
 Animation is widely used in web and mobile design to create smooth transitions, enhance
navigation, and improve the overall user experience. Animated elements can guide users
through interfaces and provide visual cues.
10. Interactive Multimedia:
 Animation plays a crucial role in interactive multimedia experiences. It responds to user
input, providing feedback and creating a more dynamic and responsive environment.

Principles of Animation
Animation is the rapid display of a sequence of images of 2-D artwork or model positions in order to
create an illusion of movement. It is an optical illusion of motion due to the phenomenon of persistence of
vision, and can be created and demonstrated in a number of ways. The most common method of presenting
animation is as a motion picture or video program, although several other forms of presenting animation also
exist.
Animation is possible because of a biological phenomenon known as persistence of vision and a
psychological phenomenon called phi. An object seen by the human eye remains chemically mapped on the
eye's retina for a brief time after viewing. Combined with the human mind's need to conceptually complete
a perceived action, this makes it possible for a series of images that are changed very slightly and very
rapidly, one after the other, to seemingly blend together into a visual illusion of movement. The following
figure shows a few cels, or frames, of a rotating logo. When the images are progressively and rapidly
changed, the arrow of the compass is perceived to be spinning.

Television video builds 25 or 30 entire frames or pictures every second, depending on the standard; the
speed with which each frame is replaced by the next one makes the images appear to blend smoothly into
movement. To make an object
travel across the screen while it changes its shape, just change the shape and also move or translate it a few
pixels for each frame.
Animation Techniques
When you create an animation, organize its execution into a series of logical steps. First, gather up in your
mind all the activities you wish to provide in the animation; if it is complicated, you may wish to create a
written script with a list of activities and required objects. Choose the animation tool best suited for the job.
Then build and tweak your sequences; experiment with lighting effects. Allow plenty of time for this phase
when you are experimenting and testing. Finally, post-process your animation, doing any special rendering
and adding sound effects.

i. Cel Animation
The term cel derives from the clear celluloid sheets that were used for drawing each frame, which have
been replaced today by acetate or plastic. Cels of famous animated cartoons have become sought-after,
suitable-for-framing collector's items. Cel animation artwork begins with keyframes (the first and last frame
of an action). For example, when an animated figure of a man walks across the screen, he balances the
weight of his entire body on one foot and then the other in a series of falls and recoveries, with the opposite
foot and leg catching up to support the body.
 The animation techniques made famous by Disney use a series of progressively different graphics on
each frame of movie film, which plays at 24 frames per second.
 A minute of animation may thus require as many as 1,440 separate frames.
 The term cel derives from the clear celluloid sheets that were used for drawing each frame, which have
been replaced today by acetate or plastic.
 Cel animation artwork begins with keyframes.
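The frame counts quoted above follow from simple arithmetic at film's 24 frames per second:

```python
FPS = 24  # traditional film rate used in cel animation

def frames_needed(seconds):
    """Number of individual cels required for a given duration."""
    return FPS * seconds

print(frames_needed(60))   # one minute of animation needs 1440 separate frames
```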

ii. Animation by Computer


Computer animation programs typically employ the same logic and procedural concepts as cel
animation, using layer, keyframe, and tweening techniques, and even borrowing from the vocabulary of
classic animators. On the computer, paint is most often filled or drawn with tools using features such as
gradients and antialiasing. The word inks, in computer animation terminology, usually means special
methods for computing RGB pixel values, providing edge detection, and layering so that images can blend
or otherwise mix their colors to produce special transparencies, inversions, and effects.
 Computer animation employs the same logic and procedural concepts as cel animation and uses
the vocabulary of classic cel animation – terms such as layer, keyframe, and tweening.
 The primary difference between the animation software programs is in how much must be drawn by
the animator and how much is automatically generated by the software.
 In 2D animation the animator creates an object and describes a path for the object to follow. The
software takes over, actually creating the animation on the fly as the program is being viewed by
your user.

 In 3D animation the animator puts his effort into creating models of individual objects and designing
the characteristics of their shapes and surfaces.
 Paint is most often filled or drawn with tools using features such as gradients and anti-aliasing.
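Tweening can be reduced to interpolating a property between two keyframe values. A minimal linear-tween sketch (the function name is illustrative):

```python
def tween(start, end, frames):
    """Linearly interpolate a value from one keyframe to the next over `frames` steps."""
    return [start + (end - start) * i / (frames - 1) for i in range(frames)]

# Move an object's x position from 0 to 100 across 5 frames
print(tween(0, 100, 5))   # [0.0, 25.0, 50.0, 75.0, 100.0]
```

Real animation software offers easing curves as well, but they are all variations on this idea of computing the in-between frames automatically.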

iii. Kinematics
 It is the study of the movement and motion of structures that have joints, such as a walking man.
 Inverse kinematics, available in high-end 3D programs, is the process by which we link objects such as
hands to arms and define their relationships and limits.
 Once those relationships are set you can drag these parts around and let the computer calculate the
result.

iv. Morphing
 Morphing is a popular effect in which one image transforms into another. Morphing applications and
other modeling tools that offer this effect can perform transitions not only between still images but
often between moving images as well.
 The morphed images were built at a rate of 8 frames per second, with each transition taking a total
of 4 seconds.
 Some products that offer morphing features are as follows:
o Black Belt's EasyMorph and WinImages
o Human Software's Squizz
o Valis Group's Flo, MetaFlo, and MovieFlo
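At its simplest, each transition frame of a morph is a weighted blend of source and target image; real morphing tools also warp geometry, but the dissolve component can be sketched as:

```python
def cross_dissolve(a, b, t):
    """Blend two grayscale pixel values; t runs from 0 (all source) to 1 (all target)."""
    return round((1 - t) * a + t * b)

frames = 8 * 4                               # 8 frames per second for 4 seconds = 32 frames
print(frames, cross_dissolve(0, 255, 0.5))   # 32 128
```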

Animation File Formats


Some file formats are designed specifically to contain animations, and they can be ported among applications
and platforms with the proper translators.
 Director *.dir, *.dcr
 AnimatorPro *.fli, *.flc
 3D Studio Max *.max
 SuperCard and Director *.pics
 CompuServe *.gif
 Flash *.fla, *.swf
Following is the list of few Software used for computerized animation:
 3D Studio Max
 Flash
 AnimatorPro

Chapter – 3
Video
Video has become an integral part of multimedia, improving the overall user experience and
engagement. Most people today would rather watch a video tutorial on YouTube than read a step-by-step
tutorial. That's why every content creator – even if they create e-books, flipbooks, or other types
of multimedia presentation – should know how to use digital video files in their multimedia marketing.

Video’s impact on multimedia


Video has revolutionized multimedia, changing the way we consume and interact with information.
Streaming video is easily accessible for everyone with a smartphone – without installing additional
multimedia software. By incorporating digital video files into other media content, it becomes more
dynamic and engaging, improving the user experience.
 Video increases engagement, letting the reader spend more time viewing your content and
interacting with it in a meaningful way. You can simultaneously watch (or listen to) video
and view graphics or text content. The more time someone spends on your site, the better.
Animation can make any multimedia application attractive since it feeds on basic human
curiosity.
 Video allows you to communicate effectively, presenting your video content in a way that
is very intuitive to comprehend. When promoting a product, you can not only describe it but
show how to use it and what it offers in practice. With digital video content, the viewer can
stop and replay it at any time, to understand things better.
 Video files can create an emotional connection because the viewer sees another human
being with whom he or she can identify. Even animation creates sympathy in viewers when
they see movement and hear voices! This makes the information more memorable, and it's
easy for the customer to remember your offer.

Analog video
Analog video refers to the representation of visual information using continuous, variable signals. In the
context of multimedia systems, analog video has been widely used in the past for various applications, such
as television broadcasting, VHS tapes, and older video cameras. Here are some key aspects of analog video
in multimedia systems:
1. Signal Representation:
 Continuous Waveform: Unlike digital video, which represents information using discrete
values (pixels), analog video uses continuous waveforms to represent the changing voltage
levels corresponding to the varying intensity of light in a scene.
 Composite Video: In many analog video systems, the entire visual information (including
color and brightness) is combined into a single composite signal.

2. Resolution and Quality:
 Limited Resolution: Analog video typically has lower resolution compared to digital
formats. The quality of the video is influenced by factors such as the bandwidth of the signal
and the recording/playback equipment.
3. Transmission and Storage:
 Broadcasting: Analog video signals were traditionally used for broadcasting television
signals. These signals were transmitted over the air or through cable systems.
 VHS Tapes: Analog video was commonly recorded and stored on VHS tapes. The quality of
the recorded video depended on the tape format and the recording device.
4. Color Encoding:
 Analog Color Systems: Analog video systems often used different methods for encoding
color information. One common method is NTSC (National Television System Committee)
in North America, PAL (Phase Alternating Line) in Europe and parts of Asia, and SECAM
(Séquentiel Couleur à Mémoire) in some other regions.
5. Drawbacks of Analog Video:
 Signal Degradation: Analog signals are susceptible to degradation and interference during
transmission or copying, resulting in a loss of picture quality.
 Noisy Playback: Analog video playback can suffer from noise, distortion, and artifacts over
time.
6. Transition to Digital:
 Advantages of Digital Video: The transition to digital video formats has brought several
advantages, including higher resolution, better signal quality, and ease of editing and
manipulation.
 Digital Multimedia Systems: Modern multimedia systems predominantly use digital video
formats for recording, storage, and playback.

Digital video
Digital video has become the dominant format in modern multimedia systems due to its numerous
advantages over analog video. Here are key aspects of digital video in multimedia systems:
1. Representation of Information:
 Discrete Values: Digital video represents visual information using discrete values, typically
in the form of pixels. Each pixel has a specific color and brightness value, contributing to the
overall image.
2. Resolution and Quality:
 High Resolution: Digital video supports higher resolutions compared to analog, allowing for
sharper and more detailed images.

 HD and 4K: Digital video commonly includes high-definition (HD) and ultra-high-definition
(4K) formats, providing improved clarity and visual fidelity.
3. Compression:
 Efficient Storage: Digital video can be compressed using various codecs (compression-
decompression algorithms) to efficiently store and transmit video data without significant
loss of quality.
 Streaming: Compression enables the streaming of high-quality video content over the
internet, contributing to the popularity of online video platforms.
4. Color Representation:
 RGB: Digital video often uses the RGB (Red, Green, Blue) color model to represent colors,
providing a wide and accurate range of color possibilities.
 YCbCr: YCbCr is another common color representation in digital video, separating
luminance (brightness) and chrominance (color information) components.
5. Editing and Manipulation:
 Non-destructive Editing: Digital video allows for non-destructive editing, where changes
can be made without degrading the original source. This includes cutting, merging, and
applying various effects.
 Special Effects: Digital video facilitates the incorporation of special effects, CGI (Computer-
Generated Imagery), and other post-production enhancements.
6. Storage and Distribution:
 Media Files: Digital video is typically stored as files in formats like MP4, AVI, MKV, or
others.
 Media Servers: Digital video can be easily distributed and accessed through media servers,
streaming services, and online platforms.
7. Playback Devices:
 Diverse Platforms: Digital video can be played on a wide range of devices, including
computers, smartphones, tablets, smart TVs, and dedicated media players.
 Compatibility: The standardization of digital video formats ensures broad compatibility
across different devices and software applications.
8. Multichannel Audio:
 Surround Sound: Digital video often includes multichannel audio formats, enabling
immersive surround sound experiences.
9. Interactive Features:
 Interactivity: Digital video supports interactive features, such as clickable annotations,
subtitles, and menu navigation in the case of DVDs or Blu-ray discs.
10. Evolution and Advancements:
 3D Video: Digital video can support 3D formats for a more immersive viewing experience.
 HDR (High Dynamic Range): Digital video can incorporate HDR technology, enhancing
contrast and color for a more lifelike image.

How Video Works and Is Displayed


1. Choose the right format
2. Plan your content
3. Produce your video and multimedia
4. Promote your video and multimedia

1. Choose the right format


Depending on your product type, target market, and goals, you can choose from a range of video and
multimedia formats to create your product demonstration. Live video is perfect for real-time interaction and
feedback with your audience, while recorded video is good for pre-recorded demonstrations that you can
edit and distribute. Animated video is great for explaining complex or abstract concepts, while interactive
video allows your audience to interact with the content. Platforms like Zoom, Facebook Live, or YouTube
Live are good for streaming live videos; tools like Camtasia, Loom, or Animoto for creating and editing
videos; Powtoon, Vyond or Toonly for animation; and HapYak, WIREWAX or Rapt Media for interactive
elements.

2. Plan your content


Once you have chosen your video and multimedia format, it's essential to plan your content carefully. To
ensure your product demonstration is clear, concise, and compelling, you should define your objective,
know your audience, craft your message, and outline your structure. When defining your objective, ask
yourself what you want to achieve with the product demonstration and if you want to educate, persuade, or
inspire your audience. Additionally, consider who you are targeting with the product demonstration - their
pain points, needs, and desires - as well as how they prefer to consume video and multimedia content. When
crafting your message, think about the main features and benefits of your product, how you differentiate
yourself from competitors, and how you address your audience's pain points and objections. Finally, when
outlining your structure, consider how to hook the audience from the start and end with a strong call to
action.

3. Produce your video and multimedia


After planning your content, you need to produce your video and multimedia with high quality and
professionalism. To achieve this, you must use the right equipment, such as cameras, microphones, lights,
tripods, or green screens. Additionally, you must follow the best practices for shooting, editing, and
publishing your content. This includes choosing the right angle, lighting, background, sound, transitions,
effects, or captions to ensure consistency and coherence. Finally, before you launch your product
demonstration, you need to test your video and multimedia on different devices, browsers and platforms to
ensure accessibility and interactivity.

4. Promote your video and multimedia


Finally, you need to promote your video and multimedia to reach your target audience and achieve your
objective. To ensure that your product demonstration is visible, shareable, and measurable, you should
choose the right channels, optimize your video and multimedia, and analyze its performance. Depending on
the format, target market, and goals of your video and multimedia, you can choose from different channels
such as websites, blogs, social media, email, or webinars. Additionally, you should optimize your video and
multimedia for search engines, social media, or email marketing by using keywords, hashtags, titles,
descriptions, thumbnails, or CTAs. Lastly, you should analyze the performance of your video and
multimedia by looking at views, watch time, engagement, clicks, conversions or feedback. Video and
multimedia can help you create a dynamic and interactive product demonstration that showcases your
product value and engages your audience. By following these tips and best practices, you can leverage video
and multimedia to boost your product demonstration success.


Digital Video Containers

Think of delivering a video as a courier shipping a labeled box of items:
Containers: The box holding the items is the container, keeping everything organized and transportable.
File Formats: The packaging label with instructions for handling and unpacking is like the file format, providing structure and guidelines.
Protocols: The courier's delivery routes and procedures are like protocols, defining how packages are transferred and delivered efficiently.

Video Formats
In practice, file extensions are used synonymously with video formats—for instance, MP4 in
"Videofile.mp4". However, this isn't entirely correct.
Most file formats comprise a combination of files, folders, and playlists (TS, M3U8, etc.) that are
necessary to play a video properly.
It is important to understand that video formats are different from video file formats/file
extensions, e.g., MOV (QuickTime Movie), WMV (Windows Media Video), AVI (Audio Video
Interleave), MP4 (MPEG-4 Part 14), etc.
Some of the most popular video streaming formats today are MP4, MPEG-DASH, and HLS.

Video Containers
File extensions for video files actually represent containers—which contain the entire gamut of files
required to play a video. This information includes the metadata and the video and audio streams.
 The video stream instructs the video player as to what should be displayed on the screen,
whereas—
 the audio stream ensures the right sound is played for the specific video.
 The metadata, or "data about data", comprises a slew of information on the video file—for instance,
its resolution, date of creation or modification, bit-rate type, subtitles, and so on.

Video Codecs
The term codec is a portmanteau of coder and decoder.
Codecs encode video or audio streams to create more manageable and streamable sizes of video and audio
files.
The video player or platform on which the video is played then decodes it depending on the
information contained in that codec and plays back the video while maintaining the quality of the original.
Similar to containers, there is a slew of different codecs in existence today to be used with different audio
and video files—some of which include H.264, H.265, VP9, AAC, MP3, and so on.
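The relationship between a container, its streams, and their codecs can be sketched with a toy model. The class names and fields below are invented purely for illustration; real containers such as MP4 or MKV are binary formats with much richer structure.

```python
from dataclasses import dataclass, field

@dataclass
class Stream:
    kind: str    # "video" or "audio"
    codec: str   # e.g. "H.264", "AAC" -- how the stream's data is encoded

@dataclass
class Container:
    """Toy model of a video container: streams plus file-level metadata."""
    extension: str
    streams: list[Stream] = field(default_factory=list)
    metadata: dict = field(default_factory=dict)

# An ".mp4" file typically wraps an H.264 video stream and an AAC audio stream:
movie = Container(
    extension="mp4",
    streams=[Stream("video", "H.264"), Stream("audio", "AAC")],
    metadata={"resolution": "1920x1080", "subtitles": ["en"]},
)
print([s.codec for s in movie.streams])
```

The key point the model captures: the extension names the container, while each stream inside it is encoded by its own codec, which is why "MP4" alone does not tell you how the video was compressed.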

Best Video Codecs


H.264 (AVC)
H.264 is a widely adopted video codec known for its efficient compression and broad compatibility.
Pros:
 Achieves good video quality at lower bitrates, making it suitable for various applications.
 Many devices have dedicated hardware support for H.264, improving playback efficiency.
Cons:
 While efficient, H.264 is surpassed by H.265 in terms of compression efficiency.
 Implementing H.264 in certain applications may involve licensing fees.
Use Case: Commonly used for streaming, video conferencing, online video, and various multimedia
applications.
H.265
H.265 is an advanced video codec designed for improved compression efficiency compared to H.264.
Pros:
 Well-suited for high-resolution content, including 4K and HDR video.
 H.265 achieves higher compression efficiency, resulting in smaller file sizes or improved quality at the
same bitrate.
Cons:
 Decoding H.265 may require more computational power compared to H.264.
 Some implementations of H.265 may involve licensing fees.
Use Case: Ideal for applications requiring high compression efficiency, such as streaming UHD content and
video surveillance.
VP9
VP9 is an open and royalty-free video codec developed by Google as a successor to VP8.
Pros:
 Achieves good video quality at lower bitrates, similar to H.265.
 Hardware support for VP9 is present in many devices, improving playback efficiency.
Cons:
 VP9 may not achieve the same level of compression efficiency as newer codecs like AV1, resulting in
larger file sizes compared to more advanced alternatives.
 VP9 decoding requires more computational power compared to older codecs like H.264, potentially
leading to performance issues on less powerful devices.
Use Case: Commonly used for streaming high-quality videos on the web, particularly in platforms that
prioritize royalty-free codecs.
AV1
AV1 is an open and royalty-free video codec developed by the Alliance for Open Media (AOMedia).
Pros:
 Designed to provide advanced compression efficiency, potentially surpassing H.265 and VP9.
 AV1 is an open standard with no licensing fees, making it cost-effective for content creators.

Cons:
 AV1 encoding is slow compared to some other codecs, which may impact the efficiency of the content
creation process, especially for real-time applications.
 AV1's widespread adoption is still in progress, and some platforms or devices may not fully support this
codec, leading to compatibility challenges.
Use Case: Emerging as a preferred codec for high-quality streaming and online video services due to its
efficiency and royalty-free nature.
Common Video Formats Lists
MP4
MP4 is a versatile video format widely used for its compatibility across devices and platforms.
Pros:
 Utilizes efficient codecs like H.264 for effective video compression without sacrificing quality.
 Allows embedding metadata such as subtitles, multiple audio tracks, and chapter information.
Cons:
 MP4 files can be less editable compared to some other formats like AVI.
 Excessive compression may lead to noticeable quality degradation.
Use Case: Ideal for sharing videos online, streaming, and playing on a diverse range of devices.
MOV
MOV is a multimedia container format developed by Apple for high-quality video playback.
Pros:
 Commonly associated with Apple devices, MOV supports high-quality video and audio playback.
 Allows the use of various video and audio codecs, providing versatility in content creation.
Cons:
 Uncompressed MOV files can be large, requiring significant storage space.
 MOV files may face compatibility issues on non-Apple devices.
Use Case: Well-suited for high-quality video production, editing, and playback within the Apple ecosystem.
AVI
AVI is a multimedia container format developed by Microsoft for storing video and audio data.
Pros:
 Supports various video and audio codecs, providing flexibility in content creation.
 AVI files generally have lower compression overhead, preserving original video quality.
Cons:
 Compared to newer formats, AVI has limited support for embedded metadata.
 Uncompressed or less compressed AVI files can result in large file sizes.
Use Case: Suitable for video editing, as it allows for minimal compression and maintains high-quality
source footage.
WebM
WebM is an open, royalty-free multimedia container format designed for efficient web video streaming.
Pros:
 WebM is an open standard with no licensing fees, making it cost-effective for content creators.
 Utilizes VP9 and AV1 codecs for efficient compression, enabling high-quality streaming.
Cons:
 Not as widely used in commercial video production compared to other formats.
 Some older devices and software may not fully support WebM, leading to compatibility issues.
Use Case: Optimized for streaming high-quality videos on the web, especially in scenarios where
royalty-free formats are preferred.
Shooting and editing video
To add full-screen, full-motion video to your multimedia project, you will need to invest in
specialized hardware and software or purchase the services of a professional video production studio. In
many cases, a professional studio will also provide editing tools and post-production capabilities
Video for Windows is an external set of software that works along with the multimedia extensions for
Windows. It provides features for digitized video recording, playback, and editing. The VidCap utility of this
software is used to capture video and audio clips using external hardware. The captured sequence can be
viewed at a number of different sizes and speeds, and different color palettes can be created for individual
frames. Video for Windows has four different editing tools, named VidEdit, PalEdit, WavEdit, and BitEdit.
As the names suggest:
 VidEdit is used to cut and paste captured video segments together.
 WavEdit works with the recorded digital audio and helps you edit it.
 PalEdit works with the color palettes within the captured video to improve the color.
 BitEdit helps clean up rough patches in the images. It also has an interface to the media
control panel to control digital video files.
1. Composition: Composition is at the heart of making attractive video. It is concerned not with
story line and plot development, or even the more technical issues of color balance, lighting,
and audio levels; rather, composition is all about the placement of your subject(s) in the
frame so that the effect is as pleasing to the eye as possible.
2. Video Compression: Video takes up a lot of space. Uncompressed recording from a camcorder
takes up about 17 MB per second of video. Because it takes up so much space, video must be
compressed before it is used. "Compressed" means that the information is packed into a smaller
space. There are two kinds of compression: lossy and lossless.
3. Lossy compression: Lossy compression means that the compressed file has less data in it than
the original file. In some cases this translates to lower-quality files, because information has been
"lost". Lossy compression makes up for the loss in quality by producing comparatively small files.
For example, DVDs are compressed using the MPEG-2 format, which can make files 15 to 30 times
smaller, but we still tend to perceive DVDs as having a high-quality picture.
4. Lossless compression: Lossless compression is exactly what it sounds like: compression where
none of the information is lost. Because reducing file size is the primary goal of compression, this is
not nearly as useful for delivery—files often end up nearly as large as they were before compression.
However, if file size is not an issue, using lossless compression will result in a perfect-quality
picture. For example, a video editor transferring files from one computer to another using a hard
drive might choose to use lossless compression to preserve quality while he or she is working.
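A back-of-the-envelope check of the figures above: raw (uncompressed) camcorder video against DVD's MPEG-2 stream. The frame size, chroma subsampling, and frame rate are assumed NTSC-style values, chosen only for illustration.

```python
# Raw data rate of NTSC-style video with 4:2:0 chroma subsampling.
width, height = 720, 480      # NTSC frame dimensions
bytes_per_pixel = 1.5         # 4:2:0 subsampling = 12 bits per pixel
fps = 29.97

raw_bytes_per_sec = width * height * bytes_per_pixel * fps
print(f"Raw video: {raw_bytes_per_sec / 1e6:.1f} MB/s")  # same ballpark as 17 MB/s

# DVD's average data rate for image and sound is about 4.69 Mbps.
dvd_bits_per_sec = 4.69e6
ratio = raw_bytes_per_sec * 8 / dvd_bits_per_sec
print(f"MPEG-2 compression ratio: about {ratio:.0f}x")  # falls in the 15-30x range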
5. Lighting: Perhaps the greatest difference between professional camcorders and consumer
camcorders is their ability to perform at low light levels. Using a simple floodlight kit, or even just
being sure that daylight illuminates the room, can improve your image. Onboard battery lights for
camcorders can be useful but only in conditions where the light acts as a "fill light" to illuminate the
details of a subject's face. The standard lighting arrangement of a studio is displayed with fill, key,
rim, and background lights. Changing any of these lights can make a dramatic difference in the shot.
6. Chroma keys: Chroma keys allow you to choose a color or range of colors that become
transparent, allowing the video image to be seen "through" the computer image. This is the
technology used by a newscast's weather person, who is shot against a blue background that is made
invisible when merged with the electronically generated image of the weather map.
7. Blue screen: Blue screen is a popular technique for making multimedia titles because expensive
sets are not required. Incredible backgrounds can be generated using 3-D modeling and graphic
software, and one or more actors, vehicles, or other objects can be neatly layered onto that
background.
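A minimal sketch of the chroma-key idea described above: pixels near a chosen key color become transparent, so a background image can show through. The key color, distance measure, and threshold here are arbitrary; real keyers work in other color spaces and soften the matte edges.

```python
def chroma_key(pixels, key=(0, 0, 255), tolerance=60):
    """Return RGBA pixels: anything close to the key color (default blue)
    gets alpha 0 (fully transparent); everything else stays opaque."""
    out = []
    for (r, g, b) in pixels:
        # Manhattan distance in RGB -- a crude but simple similarity measure.
        distance = abs(r - key[0]) + abs(g - key[1]) + abs(b - key[2])
        alpha = 0 if distance <= tolerance else 255
        out.append((r, g, b, alpha))
    return out

# A blue-backdrop pixel disappears; a skin-tone pixel stays opaque:
result = chroma_key([(10, 10, 240), (200, 150, 120)])
print(result)  # → [(10, 10, 240, 0), (200, 150, 120, 255)]
```

Compositing then simply draws the background wherever alpha is 0, which is how the weather map appears "behind" the presenter.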
Unit – III

Making Multimedia: The Stages of a Multimedia Project, the Intangibles, Hardware, Software, Authoring

Systems.

Designing and producing: designing the structure, designing the user interface, a multimedia design case

history, producing.
CHAPTER 1: MAKING MULTIMEDIA
Definition: How do you make multimedia?
 Multimedia is made step by step: with guidance and suggestions for getting started, you learn
about planning a project; about producing, managing, and designing it; about getting material
and content; about testing the work; and, ultimately, about shipping it to end users or posting it
to the Web.

The Stages of a Multimedia Project


 Most multimedia and web projects must be undertaken in stages.
 Some stages should be completed before other stages begin, and some stages may be
skipped or combined. Here are the four basic stages in a multimedia project:
1. Planning and costing
2. Designing and producing
3. Testing
4. Delivering
1. Planning and costing
 A project always begins with an idea or a need that you then refine by outlining its
messages and objectives.
 Identify how to make each message and objective work within your authoring system.
 Before you begin developing, plan out the writing skills, graphic art, music, video, and
other multimedia expertise that you will require.
 Develop a creative "look and feel" (what a user sees on a screen and how he or she
interacts with it), as well as a structure and a navigational system that will allow the
viewer to visit the messages and content.
 The more time you spend getting a handle on the project by defining its content and
structure at the outset, the faster you can build it later, and the less reworking and
rearranging will be required midstream.
2. Designing and producing
 Perform each of the planned tasks to create a finished product.
 During this stage, there may be many feedback cycles with a client until the client is
happy.
3. Testing
 Test the programs to make sure that they meet the objectives of the project,
work properly on the intended delivery platforms, and meet the needs of the client or
end user.
4. Delivering

 Package and deliver the project to the end user.


 Be prepared to follow up over time with tweaks, repairs, and upgrades.

What You Need: The Intangibles


 You need hardware, software, and good ideas to make multimedia.
 To make good multimedia, you need talent and skill.
 You need to stay organized, because as the construction work gets under way, all the little bits and
pieces of multimedia content—the six audio recordings of Alaskan Eskimos, for example—must be tracked.
 You need time and money, and you need to budget these precious commodities.
 You need the help of other people.
 Multimedia development of any scale greater than the most basic level is inherently a team
effort: artwork is performed by graphic artists, video shoots by video producers, sound editing
by audio producers, and programming by programmers.
Creativity
 Before beginning a multimedia project, you must first develop a sense of its scope and content.
 Taking inspiration from earlier experiments, developers modify and add their own creative
touches for designing their own unique multimedia projects.
 It is very difficult to learn creativity.
Organization
 It's essential to develop an organized outline and a plan that rationally details the skills, time,
budget, tools, and resources you will need for a project.
 These should be in place before you start to render graphics, sounds, and other components, and a
protocol should be established for naming the files so you can organize them for quick retrieval
when you need them.
 These files—called assets—should continue to be monitored throughout the project's execution.
Communication
 Many multimedia applications are developed in workgroups comprising instructional designers,
writers, graphic artists, programmers, and musicians located in the same office space or
building.
 The workgroup members‘ computers are typically connected on a local area network (LAN).
 The client‘s computers, however, may be thousands of miles distant, requiring other methods for
good communication.
 Communication among workgroup members and with the client is essential to the efficient and
accurate completion of the project.
If the client and the development system are both connected to the Internet, a combination of Skype video
and voice telephone, e-mail, and the File Transfer Protocol (FTP) may be the most cost-effective and
efficient solution for both creative development and project management.

 In the workplace, use quality equipment and software for your communications setup. The
cost—in both time and money—of stable and fast networking will be worth it.

What You Need: Hardware


 The two most significant platforms for producing and delivering multimedia projects are the Apple
Macintosh operating system (OS) and the Microsoft Windows OS found running on most Intel-
based PCs (including Intel-based Macintoshes).
 These computers, with their graphical user interfaces and huge installed base of many millions of
users throughout the world, are the most commonly used platforms for the development and
delivery of today's multimedia.
 Detailed and animated multimedia is also created on specialized workstations from Silicon
Graphics, Sun Microsystems, and even on mainframes, but the Macintosh and the Windows PC
offer a compelling combination of affordability, software availability, and worldwide
obtainability.
 A graphic image is still a graphic image, and a digitized sound is still a digitized sound,
regardless of the methods or tools used to make and display it or to play it back.
 Many software tools readily convert picture, sound, and other multimedia files from Macintosh
to Windows format, and vice versa, using known file formats or even binary compatible files that
require no conversion at all.
 While there is a lot of talk about platform-independent delivery of multimedia on the Internet,
with every new version of a browser there are still annoying failures on both platforms.
 These failures in cross-platform compatibility can consume great amounts of time as you
prepare for delivery by testing and developing workarounds and tweaks so the project performs
properly in various target environments.
The following table shows the penetration of operating systems.

Windows: 90.76%    Mac: 4.32%    Other: 4.92%
Table: Worldwide Operating System Market Share, September 2010
Windows vs. Macintosh
 A Windows computer is not a computer per se, but rather a collection of parts that are tied
together by the requirements of the Windows operating system.
 Power supplies, processors, hard disks, CD-ROM and DVD players and burners, video and
audio components, monitors, keyboards, mice, WiFi, and Bluetooth transceivers—it doesn't
matter where they come from or who makes them.
 Made in Texas, Taiwan, Indonesia, India, Ireland, Mexico, or Malaysia by widely known or
little-known manufacturers, these components are assembled and branded by Dell, HP, Sony, and
others into computers that run Windows.
Networking Macintosh and Windows Computers
 When working in a multimedia development environment consisting of a mixture of Macintosh and
Windows computers, you will want the machines to communicate with each other, and you may
also wish to share other resources among them, such as printers.
 Local area networks (LANs) and wide area networks (WANs) can connect the members of a
workgroup.
 In a LAN, workstations are usually located within a short distance of one another—on the same
floor of a building, for example.
 WANs are communication systems spanning greater distances, typically set up
and managed by large corporations and institutions for their own use, or to share with other
users.
 LANs allow direct communication and sharing of peripheral resources such as file servers,
printers, scanners, and network routers.
 They use a variety of proprietary technologies to perform the connections, most commonly
Ethernet (using twisted-pair copper wires) and WiFi (using radio).
 Ethernet is only a method for wiring up computers, so you still will need client/server software
to enable the computers to speak with each other and pass files back and forth. The Windows
and Mac operating systems provide this networking software, but you may need expert help to
set it up—it can be complicated!
Connections
 The equipment required for developing your multimedia project will depend on the content of
the project as well as its design.
 You will certainly need as fast a computer as you can lay your hands on, with lots of RAM and disk
storage space. The following table shows various device connection methodologies and their data
transfer rates.
 If you use existing content such as sound effects, music, graphic art, clip animations, and video in
your project, you may not need extra tools for making your own.
 However, many multimedia developers have separate equipment for digitizing sound from tapes or a
microphone, for scanning photographs or other printed matter, and for making digital still or
movie images.
Connection Transfer Rate
Serial port 115 Kbps (0.115 Mbps)
Standard parallel port 115 Kbps (0.115 Mbps)
USB (Original 1.0) 12 Mbps (1.5 MBps)
SCSI-2 (Fast SCSI) 80 Mbps
SCSI (Wide SCSI) 160 Mbps
Ultra2 SCSI 320 Mbps
FireWire 400 (IEEE 1394) 400 Mbps
USB (Hi-Speed 2.0) 480 Mbps
SCSI (Wide Ultra2) 640 Mbps
FireWire 800 (IEEE 1394) 800 Mbps
SCSI (Wide Ultra3) 1,280 Mbps
SATA 150 1,500 Mbps
SCSI (Ultra4) 2,560 Mbps
SATA 300 3,000 Mbps
FireWire 3200 (IEEE 1394) 3,144 Mbps
USB (Super-Speed 3.0) 3,200 Mbps
SCSI (Ultra5) 5,120 Mbps
SATA 600 6,000 Mbps
Fibre Channel (Optic) 10,520 Mbps
Table Maximum Transfer Rates for Various Connections in Megabits Per Second
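Using the maximum rates from the table, here is a quick sketch of how long a 700 MB CD image would take to move over a few of these connections. These are theoretical maxima, so real-world transfers run slower.

```python
# Maximum rates (in megabits per second) taken from the table above.
RATES_MBPS = {
    "USB 1.0": 12,
    "FireWire 400": 400,
    "USB 2.0 Hi-Speed": 480,
    "SATA 150": 1500,
}

def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    """Seconds to move size_mb megabytes at rate_mbps megabits per second."""
    return size_mb * 8 / rate_mbps   # megabytes -> megabits, then divide

for name, rate in RATES_MBPS.items():
    print(f"{name}: {transfer_seconds(700, rate):7.1f} s")
```

The factor-of-8 conversion between megabits (the table's unit) and megabytes (the file's unit) is the most common source of confusion when estimating transfer times.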
SCSI
 The Small Computer System Interface (SCSI—pronounced "scuzzy") adds peripheral
equipment such as disk drives, scanners, CD-ROM players, and other devices that
conform to the SCSI standard.
 SCSI connections may connect internal devices, such as hard drives that are inside the chassis of
the computer and use the computer's power supply, and external devices, which are outside the
chassis and are plugged into the computer.
IDE, EIDE, Ultra IDE, ATA, and Ultra ATA
 Integrated Drive Electronics (IDE) connections, also known as Advanced Technology
Attachment (ATA) connections, are typically only internal, and they connect hard disks, CD-
ROM drives, and other peripherals mounted inside the PC.
 With IDE controllers, you can install a combination of hard disks, CD-ROM drives, or other
devices in your PC.
 The circuitry for IDE is typically much less expensive than for SCSI, but comes with some
limitations. For example, IDE requires time from the main processor chip, so only one drive in a
master/slave pair can be active at once.
USB
 A consortium of industry players including Compaq, Digital Equipment, IBM, Intel, Microsoft,
NEC, and Northern Telecom was formed in 1995 to promote a Universal Serial Bus (USB)
standard for connecting devices to a computer.
 These devices are automatically recognized ("plug-and-play") and installed without users
needing to install special cards or turn the computer off and on when making the connection
(allowing "hot-swapping").
 USB technology has improved in performance.
 USB uses a single cable to connect as many as 127 USB peripherals to a single personal computer.
 Hubs can be used to "daisy-chain" many devices. USB connections are now common on video
game consoles, cameras, GPS locators, cell phones, televisions, MP3 players, PDAs, and
portable memory devices.
Memory and Storage Devices
 You may need to add more memory and storage space to your computer, such as a larger hard disk.
 To estimate the memory requirements of a multimedia project—the space required on a hard
disk, thumb drive, CD-ROM, or DVD, not the random access memory (RAM) used while your
computer is running—you must have a sense of the project's content and scope.
 Color images, text, sound bites, video clips, and the programming code that glues it all together
all require memory.
 When making multimedia, you will also need to allocate memory for storing and archiving working files
used during production: original audio and video clips, edited pieces, final mixed pieces,
production paperwork and correspondence, and at least one backup of your project files, with a
second backup stored at another location.
Random Access Memory (RAM)
 Faced with budget constraints, you can certainly produce a multimedia project on a slower or
limited-memory computer.
 But a fast processor without enough RAM may waste processor cycles while it swaps needed
portions of program code into and out of memory.
 Increasing available RAM may show more performance improvement on your system than
upgrading the processor chip.
Read-Only Memory (ROM)
 Unlike RAM, read-only memory (ROM) is not volatile.
 When you turn off the power to a ROM chip, it will not forget, or lose its memory.
 ROM is typically used in computers to hold the small BIOS program that initially boots
up the computer, and it is used in printers to hold built-in fonts.
Hard Disks
 Adequate storage space for your production environment can be provided by large-capacity hard
disks, server-mounted on a network.
 As multimedia has reached consumer desktops, makers of hard disks have built smaller-
profile, larger-capacity, faster, and less-expensive hard disks.
 As network and Internet servers drive the demand for centralized data storage requiring terabytes
(one trillion bytes), hard disks are often configured into fail-proof redundant arrays offering
built-in protection against crashes.
CD-ROM Discs
 Compact disc read-only memory (CD-ROM) players have become an integral part of the
multimedia development workstation and are an important delivery vehicle for mass-produced
projects.
 A wide variety of developer utilities, graphic backgrounds, stock photography and sounds,
applications, games, reference texts, and educational software are available on this medium.
 CD-ROM players have typically been very slow to access and transmit data (150 KBps, which
is the speed required of consumer audio CDs), but developments have led to double-, triple-,
and quadruple-speed drives, and on to 24x, 48x, and 56x drives designed specifically for
computer (not Red Book Audio) use.
 With a compact disc recorder, you can make your own CDs, using CD-recordable (CD-R) blank
discs to create a CD in most formats of CD-ROM and CD-Audio.
 A CD-RW (read and write) recorder can rewrite 700MB of data to a CD-RW disc about 1,000
times.
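The "x" speed ratings above are simple multiples of the 150 KBps base rate, which can be translated into data rates directly:

```python
# CD-ROM drive speeds are multiples of the 150 KBps rate that
# consumer audio CDs require (the "1x" base rate).
BASE_KBPS = 150

for multiple in (1, 2, 4, 24, 48, 56):
    print(f"{multiple:2d}x drive: {BASE_KBPS * multiple} KBps")
```

So a 48x drive moves data at 7,200 KBps, roughly 7 MB per second at its maximum.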
Digital Versatile Discs (DVD)
 In December 1995, nine major electronics companies (Toshiba, Matsushita, Sony, Philips, Time
Warner, Pioneer, JVC, Hitachi, and Mitsubishi Electric) agreed to promote a new optical disc
technology for distribution of multimedia and feature-length movies called Digital Versatile
Disc (DVD)
 With a DVD capable not only of gigabyte storage capacity but also full-motion video (MPEG2)
and high-quality audio in surround sound, this is an excellent medium for delivery of
multimedia projects
 There are three types of DVD, including
o DVD-Read Write,
o DVD-Video, and
o DVD-ROM.
These types reflect marketing channels, not the technology.
DVD Feature | DVD Specification | Blu-ray Specification
Disc diameter | 120 mm (5 inches) | 120 mm (5 inches)
Disc thickness | 1.2 mm (0.6 mm thick disc × 2) | 1.2 mm (0.6 mm thick disc × 2)
Memory capacity | 4.7 gigabytes/single side | 25 gigabytes/single layer
Wavelength of laser diode | 650 nanometer/635 nanometer (red) | 405 nanometer (blue-violet)
Data transfer rate (1x) | Variable speed data transfer at an average rate of 4.69 Mbps for image and sound | Variable speed data transfer at an average rate of 36 Mbps for image and sound
Image compression | MPEG2 digital image compression | MPEG-2 Part 2, H.264/MPEG-4 AVC, and SMPTE VC-1
Audio | Dolby AC-3 (5.1 ch) and LPCM for NTSC; MPEG Audio and LPCM for PAL/SECAM (a maximum of 8 audio channels and 32 subtitle channels can be stored) | Dolby Digital (AC-3), DTS, and linear PCM
Running time (movies) | Single layer (4.7GB): 133 minutes a side (at an average data rate of 4.69 Mbps for image and sound, including three audio channels and four subtitle channels) | Single layer (25GB): encoded using MPEG-2 video, about two hours of HD content; using the VC-1 or MPEG-4 AVC codecs, about 4 hours of HD-quality video and audio
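The running-time figures above follow directly from capacity divided by average bit rate. A quick Python check (assuming decimal gigabytes, as disc vendors quote them):

```python
# Sanity-check the "133 minutes a side" figure for a single-layer DVD:
# 4.7 GB of storage streamed at an average of 4.69 Mbps.
def running_time_minutes(capacity_gb, avg_mbps):
    bits = capacity_gb * 1e9 * 8        # capacity in bits (decimal GB)
    seconds = bits / (avg_mbps * 1e6)   # drained at the average bit rate
    return seconds / 60

print(round(running_time_minutes(4.7, 4.69)))  # ~134 minutes, close to the spec's 133
```

The small difference from the quoted 133 minutes comes from filesystem overhead and rounding in the published specification.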
What You Need: Software
 Multimedia software tells the hardware what to do.
 Display the color red. Move that tiger three leaps to the left. Slide in the words "Now You've
Done It!" from the right and blink them on and off.
 Play the sound of cymbals crashing. Run the digitized trailer for Avatar. Turn down the volume
on that MP3 file!
 The basic tool set for building multimedia projects contains one or more authoring systems and
various editing applications for text, images, sounds, and motion video.
 A few additional applications are also useful for capturing images from the screen, translating
file formats, and moving files among computers.
 The tools used for creating and editing multimedia elements on both Windows and Macintosh
platforms do image processing and editing, drawing and illustration, 3-D and CAD, OCR and
text editing, sound recording and editing, video and moviemaking, and various utilitarian
housekeeping tasks.
Text Editing and Word Processing Tools
 A word processor is usually the first software tool computer users learn.
 From letters, invoices, and storyboards to project content, a word processor may also be the tool
you use most often as you design and build a multimedia project.
 The better your keyboarding or typing skills, the easier and more efficient your multimedia day-to-
day life will be.
 An office or workgroup will often choose a single word processor to share documents in a standard
format. Word processors such as Microsoft Word and WordPerfect are powerful applications
that include spell checkers, table formatters, thesauruses, and prebuilt templates for letters,
résumés, purchase orders, and other common documents.
 Many developers have begun to use OpenOffice (www.openoffice.org) for word processing,
spreadsheets, presentations, graphics, databases, and more.
 It can be downloaded and used completely free of charge for any purpose and is available in
many languages.
 It can read and write files from other, more expensive, office packages. In many word processors,
you can embed multimedia elements such as sounds, images, and video.
OCR Software
 OCR software turns bitmapped characters into electronically recognizable ASCII text.
 A scanner is typically used to create the bitmap. Then the software breaks the bitmap into
chunks according to whether it contains text or graphics, by examining the texture and density
of areas of the bitmap and by detecting edges.
 The text areas of the image are then converted to ASCII characters using probability and expert
system algorithms.
 Most OCR applications claim about 99 percent accuracy when reading 8- to 36-point printed
characters at 300 dpi and can reach processing speeds of about 150 characters per second.
 These programs do, however, have difficulty recognizing poor copies of originals where the
edges of characters have bled; these and poorly received faxes in small print may yield more
recognition errors than it is worthwhile to correct after the attempted recognition.
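The accuracy and throughput figures above translate into practical numbers for a typical scan job. A rough Python estimate (the 3,000-character page size is an assumption chosen for illustration):

```python
# Back-of-the-envelope OCR figures, using the ~99% accuracy and
# ~150 characters/second cited above.
ACCURACY = 0.99
CHARS_PER_SECOND = 150

def expected_errors(char_count, accuracy=ACCURACY):
    """Characters likely to be misrecognized and need manual correction."""
    return char_count * (1 - accuracy)

def scan_time_seconds(char_count, rate=CHARS_PER_SECOND):
    """Recognition time for the given amount of text."""
    return char_count / rate

page = 3000  # roughly one dense printed page (an assumed figure)
print(round(expected_errors(page)))  # ~30 characters to fix per page
print(scan_time_seconds(page))       # 20 seconds of recognition time
```

Thirty corrections per page is why OCR output still needs proofreading, and why badly degraded originals quickly stop being worth scanning at all.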
Painting and Drawing Tools
 Painting and drawing tools, as well as 3-D modelers, are perhaps the most important items in
your toolkit because, of all the multimedia elements, the graphical impact of your project will
likely have the greatest influence on the end user.
 Painting software, such as Photoshop, Fireworks, and Painter, is dedicated to producing
crafted bitmap images.
 Drawing software, such as CorelDraw, FreeHand, Illustrator, Designer, and Canvas, is
dedicated to producing vector-based line art easily printed to paper at high resolution.
 Some vector-based packages such as Macromedia's Flash are aimed at reducing file download
times on the Web and may contain both bitmaps and drawn art.
Look for these features in a drawing or painting package:
 An intuitive graphical user interface with pull-down menus, status bars, palette control, and dialog
boxes for quick, logical selection.
 Scalable dimensions, so that you can resize, stretch, and distort both large and small bitmaps
 Paint tools to create geometric shapes, from squares to circles and from curves to complex
polygons
 The ability to pour a color, pattern, or gradient into any area
 The ability to paint with patterns and clip art
 Customizable pen and brush shapes and sizes
 An eyedropper tool that samples colors
 An autotrace tool that turns bitmap shapes into vector-based outlines
 Support for scalable text fonts and drop shadow
 Multiple undo capabilities, to let you try again
 A history function for redoing effects, drawings, and text
 A property inspector
 A screen capture facility
 Painting features such as smoothing coarse-edged objects into the background with anti-
aliasing; airbrushing in variable sizes, shapes, densities, and patterns; washing
colors in gradients; blending; and masking.
 Support for third-party special-effect plug-ins
 Object and layering capabilities that allow you to treat separate elements independently
 Zooming, for magnified pixel editing
 All common color depths: 1-, 4-, 8-, 16-, 24-, or 32-bit color, and grayscale
 Good color management and dithering capability among color depths using various color models
such as RGB, HSB, and CMYK
 Good palette management when in 8-bit mode
 Good file importing and exporting capability for image formats such as PIC, GIF, TGA, TIF,
PNG, WMF, JPG, PCX, EPS, PTN, and BMP.
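The color depths listed above determine how many distinct colors each pixel can show: a pixel of `depth` bits can take 2^depth values. A quick Python illustration:

```python
# Number of distinct colors available at each common color depth.
def color_count(bits_per_pixel):
    """Each pixel stores this many bits, so 2**bits values are possible."""
    return 2 ** bits_per_pixel

for depth in (1, 4, 8, 16, 24):
    print(depth, color_count(depth))
# 8-bit mode gives the 256-entry palette that needs "palette management";
# 24-bit gives the ~16.7 million colors of true color.
```

This is why dithering matters when converting down to 8-bit mode: millions of colors must be approximated from a palette of only 256.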
Animation, Video, and Digital Movie Tools
 Animations and digital video movies are sequences of bitmapped graphic scenes (frames),
rapidly played back.
 But animations can also be made within the authoring system by rapidly changing the location
of objects, or sprites, to generate an appearance of motion. Most authoring tools adopt either a
frame- or object-oriented approach to animation, but rarely both.
 To make movies from video, you may need special hardware to convert an analog video signal to
digital data. Macs and PCs with FireWire (IEEE 1394) or USB ports can import digital video
directly from digital camcorders.
 Moviemaking tools such as Premiere, Final Cut Pro, VideoShop, and MediaStudio Pro let you edit
and assemble video clips captured from camera, tape, other digitized movie segments, animations,
scanned images, and digitized audio or MIDI files. The completed clip, often with added
transition and visual effects, can then be played back, either standalone or windowed within
your project.
What You Need: Authoring Systems
 Multimedia authoring tools provide the important framework you need for organizing and
editing the elements of the multimedia project, including graphics, sounds, animations, and
video clips.
 Authoring tools are used for designing interactivity and the user interface, for presenting your
project on screen, and for assembling diverse multimedia elements into a single, cohesive
product.
 Authoring software provides an integrated environment for binding together the content and
functions of the project, and typically includes everything you need to create, edit, and import
specific types of data; assemble raw data into a playback sequence or cue sheet; and provide a
structured method or language for responding to user input.
With multimedia authoring software, we can make:
 Video productions
 Animations
 Games
 Interactive web sites
 Demo disks and guided tours
 Presentations
 Kiosk applications
 Interactive training
 Simulations, prototypes, and technical visualizations

Helpful Ways to Get Started
Consider the following tips for making production work go smoothly:
 Use templates that people have already created to set up your production. These can include
appropriate styles for all sorts of data, font sets, color arrangements, and particular page setups that will
save you time.
 Use wizards when they are available—they may save much time and pre-setup work.
 Use named styles, because taking the time to create your own will really slow you down.
Unless your client specifically requests a particular style, you will save a great deal of
time using something already created, usable, and legal.
 Create tables, which you can build with a few keystrokes in many programs and which make
the production look credible.
 Help readers find information with tables of contents, running headers and footers, and
indexes.
 Improve document appearance with bulleted and numbered lists and symbols.
 Allow for a quick-change replacement using the global change feature.
 Reduce grammatical errors by using the grammar and spell checker provided with the
software. Do not rely on that feature, though, to set all things right; you still need to
proofread everything.
Making Instant Multimedia
 Common desktop tools have become multimedia-powerful.
 Some multimedia projects may be so simple that you can cram all the organizing, planning,
rendering, and testing stages into a single effort, and make "instant" multimedia. These tools give
you many more ways to convey your message effectively than just a slide show.
Types of Authoring Tools
 Each multimedia project you undertake will have its own underlying structure and purpose
and will require different features and functions.
 E-learning modules such as those seen on PDAs, MP3 players, and intra-college
networks may include web-based teaching materials, multimedia CD-ROMs or web
sites, discussion boards, collaborative software, wikis, simulations, games, electronic
voting systems, blogs, computer-aided assessment, animation, learning management
software, and e-mail.
 This is also referred to as distance learning or blended learning, where online learning
is mixed with face-to-face learning.
The various multimedia authoring tools can be categorized into three groups, based on the method used
for sequencing or organizing multimedia elements and events:
 Card- or page-based tools
 Icon-based, event-driven multimedia and game-authoring tools
 Time-based tools
Card- and Page-Based Authoring Tools
 Card-based or page-based tools are authoring systems, wherein the elements are organized as pages
of a book or a stack of cards
 Thousands of pages or cards may be available in the book or stack. These tools are best used
when the bulk of your content consists of elements that can be viewed individually, letting the
authoring system link these pages or cards into organized sequences.
 Jump, on command, to any page you wish in the structured navigation pattern.
 Page-based authoring systems such as LiveCode from Runtime Revolution (www.runrev.com)
and ToolBook (www.toolbook.org) contain media objects: buttons, text fields, graphic objects,
backgrounds, pages or cards, and even the project itself.
 The characteristics of objects are defined by properties (highlighted, bold, red, hidden, active,
locked, and so on).
 Each object may contain a programming script, usually a property of that object, activated when
an event (such as a mouse click) related to that object occurs.
 Events cause messages to pass along the hierarchy of objects in the project; for example, a
mouse- clicked message could be sent from a button to the background, to the page, and then to
the project itself.
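The message hierarchy described above can be sketched as a chain of objects, each either handling an event or forwarding it to its parent. This is an illustrative model only; the class and handler names here are invented, not taken from LiveCode or ToolBook:

```python
# Sketch of event bubbling in a card/page-based authoring system: a
# message is offered to the frontmost object first and passes up the
# chain (button -> background -> page -> project) until a script handles it.
class MediaObject:
    def __init__(self, name, parent=None, script=None):
        self.name = name
        self.parent = parent    # next object up the hierarchy
        self.script = script    # optional handler attached to this object

    def send(self, message):
        if self.script:
            return self.script(self.name, message)
        if self.parent:         # unhandled here: pass the message along
            return self.parent.send(message)
        return None             # reached the top; nobody handled it

project = MediaObject("project", script=lambda n, m: f"{n} handled {m}")
page = MediaObject("page 1", parent=project)
background = MediaObject("background", parent=page)
button = MediaObject("OK button", parent=background)

print(button.send("mouseClicked"))  # "project handled mouseClicked"
```

Attaching a script lower in the chain (say, on the button itself) would intercept the message before it ever reaches the page or project, which is exactly how these tools let individual objects override default behavior.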
Time-Based Authoring Tools
o Time-based tools are authoring systems, wherein elements and events are organized along a
timeline, with resolutions as high as or higher than 1/30 second.
o Time-based tools are best to use when you have a message with a beginning and an end.
o Sequentially organized graphic frames are played back at a speed that you can set.
o Other elements (such as audio events) are triggered at a given time or location in the sequence of
events.
o The more powerful time-based tools let you program jumps to any location in a sequence, thereby
adding navigation and interactive control.
o Each tool uses its own distinctive approach and user interface for managing events over time.
o Many use a visual timeline for sequencing the events of a multimedia presentation, often displaying
layers of various media elements or events alongside the scale in increments as precise as one second.
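The core behavior of a time-based tool, firing events in timeline order from any starting point, can be sketched in a few lines. All names here are illustrative, not drawn from any particular product:

```python
# Sketch of a time-based authoring model: events sit on a timeline and
# fire in order as the playhead advances; a "jump" is just starting
# playback from a later time.
FRAME = 1 / 30  # typical timeline resolution: 1/30 second per tick

def play(timeline, start=0.0, end=None):
    """Return the actions fired between start and end, in time order."""
    fired = []
    for t, action in sorted(timeline):
        if t >= start and (end is None or t <= end):
            fired.append(action)
    return fired

timeline = [
    (0.0, "show title frame"),
    (2.0, "start background audio"),
    (5.0, "activate jump button"),   # enables interactive control
]
print(play(timeline))             # plays the whole sequence from the top
print(play(timeline, start=2.0))  # a jump into the middle of the sequence
```

Real tools layer many such tracks (graphics, audio, scripts) against one shared timescale, but the sequencing model is the same.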
Choosing an Authoring Tool
 Authoring tools are constantly being improved by their makers, who add new features and
increase performance with upgrade development cycles of six months to a year.
 It is important that you study the software product reviews in the blogs and computer trade
journals, as well as talk with current users of these systems, before deciding on the best ones for
your needs. Here's what to look for:
Editing Features
 The elements of multimedia—images, animations, text, digital audio and MIDI music, and
video clips—need to be created, edited, and converted to standard file formats, using specialized
applications.
 Also, editing tools for these elements, particularly text and still images, are often included in
your authoring system.
 The editors that may come with an authoring system will offer only a subset of the substantial
features found in dedicated tools.
Playback Features
As you build your multimedia project, you will be continually assembling elements and testing to see
how the assembly looks and performs. Your authoring system should let you build a segment or part of
your project and then quickly test it as if the user were actually using it. You should spend a great deal
of time going back and forth between building and testing as you refine and smooth the content and
timing of the project. You may even want to release the project to others who you trust to run it ragged
and show you its weak points.
Delivery Features
 Delivering your project may require building a run-time version of the project using the
multimedia authoring software.
 A run-time version or standalone allows your project to play back without requiring the full
authoring software and all its tools and editors.
 Often, the run-time version does not allow users to access or change the content, structure,
and programming of the project.
Cross-Platform Features
 It is also increasingly important to use tools that make transfer across platforms easy. For many
developers, the Macintosh remains the multimedia authoring platform of choice, but 80 percent
of that developer‘s target market may be Windows platforms.
 Develop on a Macintosh, look for tools that provide a compatible authoring system for
Windows or offer a run-time player for the other platform.
Internet Playability
 Because the Web has become a significant delivery medium for multimedia, authoring systems
typically provide a means to convert their output so that it can be delivered within the context of
HTML or DHTML, either with special plug-ins or by embedding Java, JavaScript, or other code
structures in the HTML document.
Test your authoring software for Internet delivery before you build your project. Be sure it performs
on the Web as you expect! Test it out for performance stability on as many platforms as you can
CHAPTER 2: MULTIMEDIA SKILLS
Multimedia Production Team
Multimedia Skills
 Video producers become experts with computer-generated animations and MIDI controls
for their edit suites.
 Architects become bored with two-dimensional drafting and create three-dimensional animated
walk-throughs.
 Oil field engineers get tired of manipulating complex data sets and design mouse-driven human
interfaces.
 Classical painters learn the electronic elements of red, green, and blue and create fantastic,
computer-based artwork.
 A multimedia developer might be any or all of these and typically doesn't fit a traditional
management information system (MIS) or computer science mold; many have never seen a line
of C++ code or booted up a Linux server. Perhaps, in the broadest definition, multimedia
developers might simply be called information technology workers.
 A multimedia production team may require as many as 18 discrete roles, including:
o Executive Producer
o Producer/Project Manager
o Creative Director/Multimedia Designer
o Art Director/Visual Designer
o Artist
o Interface Designer
o Game Designer
o Subject Matter Expert
o Instructional Designer/Training Specialist
o Scriptwriter
o Animator (2-D/3-D)
o Sound Producer
o Music Composer
o Video Producer
o Multimedia Programmer
o HTML Coder
o Lawyer/Media Acquisition
o Marketing Director
Project Manager
 A project manager's role is at the center of the action. He or she is responsible for the overall
development and implementation of a project as well as for day-to-day operations.
 Budgets, schedules, creative sessions, time sheets, illness, invoices, and team dynamics—the
project manager is the glue that holds it together.
Project Manager/Interface Expert
 Multimedia company looking to immediately fill position working on interactive television
project for major telecommunications company.
 Project manager needed to manage production and design efforts on large-scale, interactive
television project for air in United States.
o Must be adept and experienced at managing complex projects, preferably with large
corporate accounts.
o Must have solid understanding of interactivity and experience with interactive media in
the broadcast television world.
o Must have several years of experience with interface design or have worked in
management of an interface design group
o Must have good design sensibilities.
o Communication skills a must; candidate must be an articulate and effective
communicator, an excellent listener, and should be able to act as a conduit for the
information passing between our team and the client's teams.
o Superior attention to detail and ability to coordinate large amounts of information a must
o Prefer entertainment experience—ideally, television or video production.
o Solid computer or digital media experience and knowledge a must.
o Travel required for visiting focus groups and gathering consumer information.
o Must function well in fast-paced, team-oriented environment.
o Position must be filled immediately.
Multimedia Designer
 The look and feel of a multimedia project should be pleasing and aesthetic, as well as
inviting and engaging.
 Screens should present an appealing mix of color, shape, and type. The project should maintain
visual consistency, using only those elements that support the overall message of the program.
 Navigation clues should be clear and consistent, icons should be meaningful, and screen
elements should be simple and straightforward.
 If the project is instructional, its design should be sensitive to the needs and styles of its
learner population, demonstrate sound instructional principles, and promote mastery of subject
matter.
 Graphic designers, illustrators, animators, and image processing specialists deal with the visuals.
 Instructional designers are specialists in education or training and make sure that the subject
matter is clear and properly presented for the intended audience.
 Interface designers devise the navigation pathways and content maps. Information designers
structure content, determine user pathways and feedback, and select presentation media based
on an awareness of the strengths of the many separate media that make up multimedia. All can be
multimedia designers.
Kurt Andersen
 Kurt Andersen is an instructional designer and was a senior designer at the George Lucas
Educational Foundation, where he designed multimedia prototypes for middle school math and
science curricula.
 A multimedia designer often wears many hats, but most importantly he or she looks at the
overall content of a project, creates a structure for the content, determines the design elements
required to support that structure, and then decides which media are appropriate for presenting
which pieces of content.
 In essence, the multimedia designer (sometimes called an information designer) prepares the
blueprint for the entire project: content, media, and interaction.
 Multimedia designers need a variety of skills. You need to be able to analyze content
structurally and match it up with effective presentation methods.
Multimedia Designer/Producer
 Seeking an experienced, new-media professional who loves inventing the future and enjoys the
challenge of integrating complex information and media systems.
 Our ideal candidate has solid experience in interface design, product prototyping, and marketing
communication
 Knowledge of image manipulation is critical, as well as proven skills in Lingo scripting and the
use of digital time-based authoring tools.
 Must have experience designing large information and/or entertainment systems.
 Must have experience creating system flows and program architectures.
 Must have solid organizational skills and attention to detail.
Interface Designer
 Like a good film editor, an interface designer's best work is never seen by the viewer; it's
"transparent."
 In its simplest form, an interface provides control to the people who use it.
 It also provides access to the "media" part of multimedia, meaning the text, graphics, animation,
audio, and video, without calling attention to itself.
 The elegant simplicity of a multimedia title screen, the ease with which a user can move about
within a project, and the effective use of windows, backgrounds, icons, and control panels: these are
the results of an interface designer's work.
Nicole Lazzaro
 Nicole Lazzaro is an award-winning interface designer with XEODesign in Oakland, California,
and teaches interface design at San Francisco State University's Multimedia Studies Program.
She spends her days thinking of new ways to design multimedia interfaces that feel more like
real life.
 The role of an interface designer is to create a software device that organizes the multimedia
content, lets the user access or modify that content, and presents the content on screen.
 Three areas are central to the creation of any interface, and of course they overlap:
o Information design,
o Interactive design, and
o Media design.
Writer
 Multimedia writers do everything writers of linear media do, and more. They create character,
action, and point of view (a traditional scriptwriter's tools of the trade), and they also create
interactivity.
 They write proposals, they script voice-overs and actors‘ narrations, they write text screens to
deliver messages, and they develop characters designed for an interactive environment.
 Writers of text screens are sometimes referred to as content writers.
 Domenic Stansberry is a writer/designer who has worked on interactive multimedia dramas
for commercial products. He has also written for documentary film and published two books of
fiction.
Video Specialist
 Prior to the 2000s, producing video was extremely expensive, requiring a large crew and
expensive equipment.
 The result is that video images delivered in a multimedia production have improved from
postage-stamp-sized windows playing at low frame rates to full-screen (or nearly full-screen)
windows playing at 30 frames per second.
 As shooting, editing, and preparing video has migrated to an all-digital format and become
increasingly affordable to multimedia developers, video elements have become more and more
part of the multimedia mix.
 For high-quality productions, it may still be necessary for a video specialist to be responsible for
an entire team of videographers, sound technicians, lighting designers, set designers, script
supervisors, gaffers, grips, production assistants, and actors.
 Video Specialist wanted for multimedia production. Must have strong background in video
direction, nonlinear editing, and preparing digital video for efficient delivery.
 Good understanding of shooting for interactive programming required.
 A background working with Ultimatte green screens for compositing live video with computer-
generated backgrounds a plus.
Audio Specialist
 The quality of audio elements can make or break a multimedia project. Audio specialists are the
wizards who make a multimedia program come alive, by designing and producing music,
voice-over narrations, and sound effects.
 They perform a variety of functions on the multimedia team and may enlist help from one or
many others, including composers, audio engineers, or recording technicians.
 Audio specialists may be responsible for locating and selecting suitable music and talent,
scheduling recording sessions, and digitizing and editing recorded material into computer files
(see Chapter 4).
Multimedia Audio Specialist
 Audio specialist needed for multimedia project.
 Must have strong background in studio recording techniques, preferably with time spent
in the trenches as an engineer in a commercial studio working on a wide range of projects.
 Must be comfortable working with computers and be open and able to learn new technology and
make it work, with high-quality results.
 Familiarity with standard recording practices, knowledge of music production, and the ability to
work with artists a definite plus.
 Requires fluency in MIDI; experience with sequencing software, patch librarians, and
synth programming; and knowledge of sampling/samplers, hard disk recording, and editing.
Multimedia Programmer
 A multimedia programmer or software engineer integrates all the multimedia elements of a
project into a seamless whole using an authoring system or programming language.
 Multimedia programming functions range from coding simple displays of multimedia elements
to controlling peripheral devices and managing complex timing, transitions, and record keeping
 Creative multimedia programmers can coax extra (and sometimes unexpected) performance
from multimedia-authoring and programming systems.
 Without programming talent, there can be no multimedia. Code, whether written in JavaScript,
OpenScript, Lingo, RevTalk, PHP, Java, or C++, is the sheet music played by a well-orchestrated
multimedia project.
Hal Wine
 Hal Wine is a programmer familiar with both the Windows and Macintosh environments. In his
many years of experience, he has worked in most of the important areas of computing and for
many of the leading computing companies.
 Interactive Programmer (HTML, JavaScript, Flash, PHP, and C/C++) needed to work on
multimedia prototyping and authoring tools for DVD and interactive web-based projects.
 Thorough knowledge of ActionScript, JavaScript, Flash, HTML5, PHP, and C/C++, Macintosh
and Windows environments required.
 Must have working familiarity with digital media, particularly digital video
 Must have a demonstrated track record of delivering quality programming on tight schedules.
 Must function well in fast-paced, team-oriented environment.
 Knowledge of AJAX methodologies desired.
Producer of Multimedia for the Web
 Web site producer is a new occupation, but putting together a coordinated set of pages for the
World Wide Web requires the same creative process, skill sets, and (often) teamwork as any
kind of multimedia does.
Kevin Edwards
Kevin Edwards is Senior Multimedia Producer for CNET, a publicly traded media company
that integrates television programming with a network of sites on the World Wide Web.
 In both types of media, CNET provides information about computers, the Internet, and future
technology using engaging content and design.
 CNET has about two million members on the Internet, and its television programming—which
airs on the USA Network, on the Sci-Fi Channel, and in national syndication—reaches an
estimated weekly audience of more than eight million viewers.
Some related areas listed by the bureau include
 Artists and related workers
 Multi-Media Artists
 Animators
 Designers
 Motion picture production and distribution
 Television, video, and motion picture camera operators and editors
 Writers and editors
Unit – IV

The Internet and Multimedia: Internet History, Internetworking, Multimedia on the Web.

Designing for the World Wide Web: Developing for the Web, Text for the Web, Images for the Web,
Sound for the Web, Animation for the Web, Video for the Web.

Delivering: Testing, Preparing for Delivery, Delivering on CD-ROM, DVD and World Wide Web,
Wrapping.

CHAPTER - 1
The Internet and Multimedia
Internet History
The history of the Internet is a complex and fascinating journey that spans several decades. Here's a brief
overview of key milestones:
1. 1960s: The Birth of the ARPANET
 The origins of the Internet can be traced back to the U.S. Department of Defense's Advanced
Research Projects Agency (ARPA), which funded the development of the ARPANET
(Advanced Research Projects Agency Network) in the late 1960s.
 The first ARPANET link was established in 1969 between the University of California, Los
Angeles (UCLA), and the Stanford Research Institute (SRI).
2. 1970s: Expansion and Email
 ARPANET continued to expand, connecting more universities and research institutions.
 The first email program was developed by Ray Tomlinson in 1971, allowing users to send
messages between different machines on the ARPANET.
3. 1980s: TCP/IP and the World Wide Web
 The development of the Transmission Control Protocol (TCP) and Internet Protocol (IP)
standards in the late 1970s laid the foundation for the modern Internet.
 In 1983, ARPANET adopted TCP/IP as its standard, ensuring compatibility between different
computer systems.
 The Domain Name System (DNS) was introduced to translate human-readable domain names
into IP addresses.
 Tim Berners-Lee invented the World Wide Web in 1989, and the first website went live in
1991. This marked the beginning of a user-friendly, interconnected system of information.
4. 1990s: Commercialization and Popularization
 The 1990s saw the commercialization of the Internet, with the emergence of Internet Service
Providers (ISPs) and the introduction of graphical web browsers like Mosaic and Netscape
Navigator.
 Online services, such as AOL and CompuServe, gained popularity.
 E-commerce began to thrive, and companies like Amazon and eBay were founded.
 The "dot-com bubble" occurred in the late 1990s, with many Internet-related companies
experiencing rapid growth and subsequent crashes.
5. 2000s: Broadband, Social Media, and Mobile Internet
 The 2000s witnessed the widespread adoption of broadband Internet, providing faster and
more reliable connections.
 Social media platforms like Facebook, Twitter, and LinkedIn emerged, transforming the way
people connect and share information.
 The rise of smartphones and mobile devices led to increased mobile Internet usage.
6. 2010s: Cloud Computing and Streaming Services
 Cloud computing became mainstream, enabling users to store and access data and
applications online.
 Streaming services, such as Netflix and Spotify, gained prominence, revolutionizing the way
people consume media.
 The Internet of Things (IoT) emerged, connecting various devices to the Internet for data
exchange.
7. 2020s: Continued Innovation and Challenges
 The Internet continues to evolve with advancements in technologies like 5G, artificial
intelligence, and blockchain.
 Cybersecurity concerns, online privacy issues, and debates over net neutrality are ongoing
challenges.
 The COVID-19 pandemic highlighted the importance of the Internet for remote work,
education, and communication.
Internetworking
In its simplest form, a network is a cluster of computers, with one computer acting as a server to
provide network services such as file transfer, e-mail, and document printing to the client computers or users
of that network. Using gateways and routers, a local area network (LAN) can be connected to other LANs
to form a wide area network (WAN). These LANs and WANs can also be connected to the Internet
through a server that provides both the necessary software for the Internet and the physical data connection
(usually a high-bandwidth telephone line, coaxial cable TV line, or wireless). Individual computers not
permanently part of a network (such as a home computer or a laptop) can connect to one of these Internet
servers and, with proper identification and onboard client software, obtain an IP address on the Internet (see
"IP Addresses and Data Packets" later in the chapter).
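The client–server relationship described above can be sketched with a minimal TCP exchange over the loopback interface. This is only an illustrative sketch: the "service" here is a one-shot echo, and the port is chosen automatically by the operating system.

```python
import socket
import threading

HOST = "127.0.0.1"  # loopback interface; no real network needed

# A tiny "server" offering one network service: echoing bytes back to a client.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))          # port 0 lets the OS pick a free port
server.listen(1)
port = server.getsockname()[1]  # the port the OS actually assigned

def serve_once():
    conn, addr = server.accept()        # wait for one client to connect
    with conn:
        conn.sendall(conn.recv(1024))   # echo back whatever the client sent

t = threading.Thread(target=serve_once)
t.start()

# A "client" computer connecting to the server to use the service.
with socket.create_connection((HOST, port)) as client:
    client.sendall(b"hello, network")
    reply = client.recv(1024)

t.join()
server.close()
print(reply.decode())  # prints "hello, network"
```

In a real LAN the client would connect to the server's address on the network rather than to the loopback interface, but the request/response pattern is the same.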
Internet Addresses
Let's say you get into a taxi at the train station in Trento, Italy, explain in English or Spanish or
German or French that you wish to go to the Mozzi Hotel, and half an hour later you are let out of the car in
a suburban wood: you have an address problem. You will quickly discover, as you return to the city in the
back of a bricklayer's lorry to report your missing luggage and the cab driver, Mauro, who sped away in the
rain, that you also have a serious language problem. If you know how addresses work and understand the
syntax or language of the Internet, you will likely not get lost and will save much time and expense during
your adventures. You will also be able to employ shortcuts and workarounds.
Top-Level Domains
When the original ARPANET protocols for communicating among computers were remade into the
current scheme of TCP/IP (Transmission Control Protocol/Internet Protocol) in 1983, the Domain Name
System (DNS) was developed to rationally assign names and addresses to computers linked to the Internet.
Top-level domains (TLDs) were established as categories to accommodate all users of the Internet: .com
(commercial), .edu (education), .gov (government), .int (international organizations), .mil (military), .net
(network providers), and .org (organizations), plus two-letter country-code domains such as .us, .uk, and .in.
In late 1998, the Internet Corporation for Assigned Names and Numbers (ICANN) was set up to
oversee the technical coordination of the Domain Name System, which allows Internet addresses to be found
by easy-to-remember names instead of one of 4.3 billion individual IP numbers. In late 2000, ICANN
approved seven additional TLDs: .aero, .biz, .coop, .info, .museum, .name, and .pro.
Second-Level Domains
Many second-level domains contain huge numbers of computers and user accounts representing
local, regional, and even international branches as well as various internal business and management
functions. So the Internet addressing scheme provides for subdomains that can contain even more
subdomains. Like a finely carved Russian matryoshka doll, individual workstations live at the epicenter of a
cluster of domains. Within the education (.edu) domain containing hundreds of universities and colleges, for
example, is a second-level domain for Yale University called yale. At that university are many schools and
departments (medicine, engineering, law, business, computer science, and so on), and each of these entities
in turn has departments and possibly subdepartments and many users. These departments operate one or
even several servers for managing traffic to and from the many computers in their group and to the outside
world. At Yale, the server for the Computing and Information Systems Department is named cis. It manages
about 11,000 departmental accounts—so many accounts that a cluster of three subsidiary servers was
installed to deal efficiently with the demand. These subsidiary servers are named minerva, morpheus, and
mercury. Thus, minerva lives in the cis domain, which lives in the yale domain, which lives in the edu
domain. Real people's computers are networked to minerva. Other real people are connected to the
morpheus and mercury servers.
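The nesting of domains described above can be read directly off a fully qualified hostname, right to left. A small sketch, using the Yale names from the example:

```python
def domain_chain(hostname):
    """Return the chain of enclosing domains, outermost first."""
    labels = hostname.split(".")
    # Build edu -> yale.edu -> cis.yale.edu -> minerva.cis.yale.edu
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

print(domain_chain("minerva.cis.yale.edu"))
# ['edu', 'yale.edu', 'cis.yale.edu', 'minerva.cis.yale.edu']
```

Each entry in the list is a domain that contains all the entries after it, just as the edu domain contains yale, which contains cis, which contains the minerva server.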
IP Addresses and Data Packets
When a stream of data is sent over the Internet by your computer, it is first broken down into packets
by the Transmission Control Protocol (TCP). Each packet includes the address of the receiving computer, a
sequence number ("this is packet #5"), error correction information, and a small piece of your data. After a
packet is created by TCP, the Internet Protocol (IP) then takes over and actually sends the packet to its
destination along a route that may include many other computers acting as forwarders. TCP/IP thus refers to
two important Internet protocols working in concert.
The 32-bit address included in a data packet, the IP address, is the "real" Internet address. It is
made up of four numbers separated by periods, for example, 140.174.162.10. Some of these numbers are
assigned by Internet authorities, and some may be dynamically assigned by an Internet service provider
(ISP) when a computer logs on using a subscriber's account. There are domain name servers throughout the
Internet whose sole job is to quickly look up text-based domain name addresses in large tables and return
the corresponding IP numbers to you for insertion into your data packets. Every time you connect to
http://www.google.com or send mail to [email protected], a domain name server is consulted and the
destination address is converted to numbers.
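The relationship between the dotted-quad notation and the underlying 32-bit number can be shown with the standard library, using the address from the text:

```python
import socket
import struct

dotted = "140.174.162.10"

# Pack the four decimal fields into one 32-bit value (network byte order)...
as_int = struct.unpack("!I", socket.inet_aton(dotted))[0]

# ...and unpack it back into dotted-quad form.
back = socket.inet_ntoa(struct.pack("!I", as_int))

print(as_int)   # 2360254986
print(back)     # 140.174.162.10
```

Each of the four numbers is one byte (0-255) of the 32-bit address, which is why there are about 4.3 billion (2^32) possible IPv4 addresses.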
Internet Services
To many users, the Internet means the World Wide Web. But the Web is only the latest and most popular of
the services available on the Internet today. E-mail; file transfer; discussion groups and newsgroups; real-time
chatting by text, voice, and video; and the ability to log into remote computers are common as well.
Each Internet service is implemented on an Internet server by dedicated software known as a
daemon. (Actually, daemons only exist on Unix/Linux systems—on other systems, such as Windows, the
services may run as regular applications or background processes.) Daemons are agent programs that run in
the background, waiting to act on requests from the outside. In the case of the Internet, daemons support
protocols such as the Hypertext Transfer Protocol (HTTP) for the World Wide Web, the Post Office
Protocol (POP) for e-mail, or the File Transfer Protocol (FTP) for exchanging files. You have probably
noticed that the first few letters of a Uniform Resource Locator (URL)—for example,
http://www.timestream.com/index.html—notify a server as to which daemon to bring into play to satisfy a
request. In
many cases, the daemons for the Web, mail, news, and FTP may run on completely different servers, each
isolated by a security firewall from other servers on a network.
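The way a URL's first few letters select a protocol (and hence a daemon) can be seen with the standard library's URL parser. A quick sketch, using the example URL from the text:

```python
from urllib.parse import urlparse

url = "http://www.timestream.com/index.html"
parts = urlparse(url)

print(parts.scheme)  # 'http' -> the HTTP daemon handles this request
print(parts.netloc)  # 'www.timestream.com' -> which server to contact
print(parts.path)    # '/index.html' -> which resource to ask that server for
```

A URL beginning with ftp:// or mailto: would parse the same way but direct the request to the FTP or mail service instead.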