Multimedia System Unit 1,2,3,4 Except Last 2 Parts
BA (Computer Applications)
SEMESTER – IV
Unit - I
Multimedia: Introduction, Definitions, Where to Use Multimedia- Multimedia in Business, Schools, Home,
Public Places, Virtual Reality; Delivering Multimedia.
Text: Meaning, Fonts and Faces, Using Text in Multimedia, Computers and Text, Font Editing and Design
Tools, Hypermedia and Hypertext.
Images: Before We Start to Create, Making Still Images, Color.
Unit - II
Sound: The Power of Sound, Digital Audio, MIDI Audio, MIDI vs. Digital Audio, Multimedia System
Sounds, Audio File Formats, Adding Sound to Our Multimedia Project.
Animation: The Power of Motion, Principles of Animation, Animation by Computer, Making Animations.
Video: Using Video, How Video Works and Is Displayed, Digital Video Containers, Obtaining Video Clips,
Shooting and Editing Video.
Unit - III
Making Multimedia: The Stages of a Multimedia Project, the Intangibles, Hardware, Software, Authoring
Systems.
Designing and producing: designing the structure, designing the user interface, a multimedia design case
history, producing.
Unit - IV
The Internet and Multimedia: Internet History, Internetworking, Multimedia on the Web.
Designing for the World Wide Web: Developing for the Web, Text for the Web, Images for the Web,
Sound for the Web, Animation for the Web, Video for the Web.
Delivering: Testing, Preparing for Delivery, Delivering on CD-ROM, DVD and World Wide Web,
Wrapping.
Unit – I
Chapter – 1
1. Introduction to Multimedia:
Multimedia refers to the integration of various forms of media elements, such as text, graphics,
audio, video, and interactive content, to convey information or entertainment. It involves the combination of
different media types to create a more engaging and complete experience for the audience. Multimedia can
be found in various aspects of our daily lives, including entertainment, education, business presentations,
and communication.
Definitions:
1. Broad Definition: Multimedia is the use of multiple forms of media to present information or
entertain an audience. It includes a range of media elements such as text, images, audio, video,
and interactive content.
2. Technical Definition: In a technical sense, multimedia involves the integration of different
media types through computer technology. This can include the use of software applications to
create, edit, and present multimedia content.
2. Where to Use Multimedia
1. Multimedia in Business:
Presentations: Businesses use multimedia extensively for creating dynamic and engaging
presentations. This includes slideshows with images, videos, and audio to convey information
effectively.
Marketing and Advertising: Multimedia is a key component of marketing strategies.
Companies use videos, animations, and interactive content for online advertisements, product
demonstrations, and promotional campaigns.
Training and Development: Multimedia is employed for employee training programs. It
includes interactive training modules, simulations, and multimedia content to enhance
learning experiences.
Web Design: Business websites often incorporate multimedia elements to make them
visually appealing and to convey information more effectively. This includes images, videos,
and interactive features.
2. Multimedia in Schools:
Online Education: With the rise of online education, multimedia is integral to delivering course
content. It includes video lectures, interactive quizzes, and other multimedia elements for a richer
learning experience.
Digital Libraries: Schools and educational institutions use multimedia to create digital libraries,
providing students with access to a variety of educational resources in different formats.
4. Multimedia in Public Places:
Digital Signage: Public places such as malls, airports, and transit stations use multimedia for
digital signage. This includes displaying advertisements, announcements, and information on
digital screens.
Interactive Kiosks: Multimedia kiosks are used in public places to provide information and
services. They may include touchscreens, videos, and interactive content to assist users.
Museums and Exhibitions: Multimedia is commonly used in museums and exhibitions to
enhance visitor experiences. This includes interactive displays, audio guides, and multimedia
presentations.
Events and Conferences: Multimedia plays a crucial role in events and conferences. This
includes presentations, live streaming, and interactive displays to engage attendees and convey
information effectively.
In summary, multimedia is ubiquitous across business, schools, homes, and public places, enriching communication, education, entertainment, and information consumption in each of these settings.
5. Virtual Reality (VR):
Virtual reality (VR) is used in many industries, including:
Entertainment: VR is used in video games, 3D cinema, amusement park rides, and social
virtual worlds.
Healthcare: VR is used for training and surgery.
Education: VR offers new methods for teaching and learning.
Architecture: VR is used for architecture and urban design.
Digital marketing: VR is used for digital marketing.
Engineering and robotics: VR is used for engineering and robotics.
VR is also used in the automotive, space and military, occupational safety, retail, and real estate sectors.
Multimedia and VR are united by their interactivity. VR headsets can be used in physical environments with special effects, or locally with an offline computer, game system, or simulator.
Gaming: VR is extensively used in the gaming industry to provide immersive gaming experiences.
Training Simulations: VR is utilized for training simulations in various fields, such as healthcare,
aviation, and military training.
Virtual Tours: VR is used to create virtual tours for real estate, tourism, and historical sites.
3. Delivering Multimedia
Delivering multimedia involves the transmission or presentation of various forms of content, such as
text, audio, video, graphics, and animations, to a target audience. The method of delivery can vary
depending on the context and the intended purpose. Here are common ways to deliver multimedia:
1. Web Platforms:
Websites: Multimedia content can be hosted on websites, providing users with the ability to access
and view content through web browsers. This includes streaming videos, interactive graphics, and
other multimedia elements.
2. Streaming Services:
Video Streaming: Platforms like YouTube, Netflix, and Vimeo allow users to stream videos in real time. Live streaming services are also used for events, webinars, and online broadcasts.
Audio Streaming: Services like Spotify, Apple Music, and podcasts deliver audio content in a
streaming format.
3. Social Media:
Video and Image Sharing: Social media platforms like Facebook, Instagram, Twitter, and TikTok
enable users to share and view multimedia content, including videos, images, and animations.
4. Email and Newsletters:
Multimedia Attachments: Multimedia content can be shared through email as attachments or
embedded within the email body. Newsletters often include multimedia elements for engagement.
5. Presentations and Slideshows:
Microsoft PowerPoint, Google Slides: Multimedia is commonly used in presentations to enhance
communication. Slideshows can include images, videos, and audio elements.
6. Educational Platforms:
Learning Management Systems (LMS): Multimedia is delivered in educational settings through
online platforms, where students can access course materials, videos, interactive simulations, and
assessments.
7. Mobile Applications:
Apps: Multimedia content is delivered through mobile applications, providing users with interactive
and engaging experiences on smartphones and tablets.
8. Digital Signage:
Public Displays: Multimedia content is often delivered through digital signage in public places,
including malls, airports, and transportation hubs, for information, advertisements, and
announcements.
9. Virtual Reality (VR):
VR Platforms: Virtual reality content is delivered through specialized platforms and applications,
offering immersive experiences in gaming, training, and virtual tours.
10. Broadcasting:
Television and Radio: Traditional broadcasting methods are still relevant for delivering multimedia
content. TV and radio broadcasts include a mix of audio and visual elements.
11. CDs/DVDs and Physical Media:
Physical Storage: Multimedia content can be distributed through CDs, DVDs, and other physical
storage media. This is less common in the age of digital delivery but is still used for certain purposes.
When delivering multimedia, it's essential to consider the target audience, the purpose of the
content, and the most suitable platform or method for reaching them effectively. Advances in technology
continue to influence how multimedia is created and delivered, with an emphasis on accessibility,
interactivity, and user engagement.
Chapter -2
Text in Multimedia
Almost all multimedia content includes text in some form. Even a menu item is text, activated by a single action such as a mouse click, a keystroke, or a finger press on the monitor (in the case of a touch screen). Text in multimedia is used to communicate information to the user. Proper use of text and words in a multimedia presentation helps the content developer communicate the idea and message to the user.
Meaning:
Words and symbols in any form, spoken or written, are the most common system of communication.
They deliver the most widely understood meaning to the greatest number of people. Most academic texts, such as journals and e-magazines, are available in a web-browser-readable form.
About Fonts and Faces in multimedia
In the context of multimedia, the terms "fonts" and "faces" are often used interchangeably, both referring to
the visual representation of characters and text. Here's an overview of these concepts and their significance
in multimedia design:
Fonts in Multimedia:
1. Definition:
A font refers to a set of characters with a consistent style, size, and weight. It includes letters,
numbers, punctuation marks, and symbols that share a unified design.
A typeface is a family of graphic characters that usually includes many type sizes and styles. A font is a collection of characters of a single size and style belonging to a particular typeface family. Typical font styles are boldface and italic. Other style attributes, such as underlining and outlining of characters, may be added at the user's choice.
The size of text is usually measured in points. One point is approximately 1/72 of an inch, i.e., about 0.0139 inch. The size of a font does not exactly describe the height or width of its characters. This is because the x-height (the height of the lowercase letter x) of two fonts may differ.
Typefaces can be described in many ways, but the most common characterization of a typeface is serif versus sans serif. The serif is the little decoration at the end of a letter stroke. Times, Times New Roman, and Bookman are fonts that come under the serif category; Arial, Optima, and Verdana are examples of sans serif fonts. Serif fonts are generally used for the body of the text for better readability, and sans serif fonts are generally used for headings. The following figure shows a few categories of serif and sans serif fonts.
Different effects and colors can be chosen for a font to make the text look distinctive.
Anti-aliasing can be used to make text look gentle and blended.
For special attention, the text can be wrapped onto a sphere or bent like a wave.
Meaningful words and phrases should be used for links and menu items.
Using Text in Multimedia
PURPOSE
1. To guide the user in navigating through the application.
2. To explain how the application works.
3. To deliver the information for which the application was designed.
# Text consists of two structures:
Linear
Non-linear
# Linear:
A single way to progress through the text, starting at the beginning and reading to the end.
# Non-linear:
Information is represented in a semantic network in which multiple related sections of the text are connected to each other.
A user may then browse through the sections of the text, jumping from one text section to another.
Why is text important?
# Factors affecting legibility of text
1. Size: the size of the text.
2. Background and foreground color: the color of the text and the color it is written on.
3. Style: also known as typeface and font.
4. Leading:
Refers to the amount of added spaces between lines of type.
Originally, when type was set by hand for printing presses, printers placed slugs, strips
of lead of various thicknesses, between lines of type to add space.
Text technology
1. Based on creating letters, numbers, and special characters.
2. May also include special icons or drawing symbols, mathematical symbols, Greek letters, etc. [© ™ ≈ ƒ]
3. Text elements can be categorized into:
Alphabet characters: A-Z
Numbers: 0-9
Special characters: punctuation [. , ; ' ...], signs or symbols [* & ^ % $ # @ ! ...]
These are also known as character sets.
FONT VS TYPEFACE
Font
1. A 'font' is a collection of characters of a particular size and style belonging to a particular typeface family.
2. Fonts usually vary by type size and style.
3. Sizes are measured in points.
4. A font includes the letter set, the number set, and all of the special characters and diacritical marks we get by pressing the Shift, Option, or Command/Control keys.
Typeface
1. A 'typeface' is a family of graphic characters that usually includes many type sizes and styles.
2. A typeface contains a series of fonts. For instance, Arial, Arial Black, Arial Narrow and Arial
Unicode MS are actually 4 fonts under the same family.
Arial
Arial Black
Arial Narrow
Arial Unicode MS
Font Effects
Font effects help draw the viewer's attention to the content:
Case: UPPER and lower case letters
Bold, Italic, Underline, Superscript and Subscript
Embossed or Shadow
Colours
Strikethrough
Leading of Text
Leading is the spacing above and below a line of type, i.e., line spacing.
TYPES OF FONTS
Two classes of fonts
SERIF TEXT
Serifs are the decorative strokes added to the ends of a letter's main strokes.
Serifs improve readability by leading the eye along the line of type.
Serif faces are best suited for body text.
Serif faces are more difficult to read at small sizes (smaller than 8 pt) and at very large sizes.
SANS SERIF TEXT
Sans serif faces do not have decorative strokes.
Sans serif text tends to be read letter by letter.
Use sans serif faces for small (smaller than 8 pt) and very large sizes.
Commonly used for footnotes and headlines.
USING TEXT IN MULTIMEDIA
The text elements used in multimedia are:
Menus for navigation
Interactive buttons
Fields for reading
HTML documents
Symbols and icons
TEXT APPLYING GUIDELINES:
Be concise
Use appropriate fonts
Make it readable
Consider type style and colors
Use restraint and be consistent
Special font editing tools can be used to make our own type so we can communicate an idea or
graphic feeling exactly. With these tools professional typographers create distinct text and display faces.
1. Fontographer:
Fontographer is a Macromedia product: a specialized graphics editor for both the Macintosh and Windows platforms. We can use it to create PostScript, TrueType, and bitmapped fonts for Macintosh and Windows.
4. Hypermedia Structures:
Two buzzwords used often in hypertext are link and node. Links are connections between conceptual elements, that is, the nodes, which may consist of text, graphics, sounds, or related information in the knowledge base.
1. Hypertext:
Definition: Hypertext refers to text that contains links (hyperlinks) to other texts, allowing
users to navigate between related pieces of information in a non-linear manner.
Usage: In multimedia systems, hypertext is commonly employed in websites, e-books, and
interactive presentations. Users can click on hyperlinks within the text to access additional
information, related content, or navigate to different sections.
2. Hypermedia:
Definition: Hypermedia extends the concept of hypertext by including various types of
media beyond text, such as images, audio, video, and interactive elements, all interconnected
through hyperlinks.
Usage: Hypermedia is prevalent in modern multimedia systems, where users can navigate not
only through text but also through a mix of media types. This enhances the overall user
experience and allows for more dynamic and engaging content.
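The node-and-link structure behind hypertext and hypermedia can be modelled as a tiny graph. The sketch below is purely illustrative; the node names and contents are invented for the example, and a real hypermedia system would store far richer media per node:

```python
# Each node holds some content; links connect nodes non-linearly.
nodes = {
    "home":   "Welcome page",
    "text":   "Chapter on text",
    "images": "Chapter on images",
    "color":  "Section on color models",
}

links = {
    "home":   ["text", "images"],
    "text":   ["home"],
    "images": ["home", "color"],
    "color":  ["images"],
}

def navigate(start, choices):
    """Follow a sequence of link choices, like a user clicking hyperlinks."""
    current = start
    for target in choices:
        if target not in links[current]:
            raise ValueError(f"no link from {current} to {target}")
        current = target
    return current

# A non-linear path through the content: home -> images -> color
print(navigate("home", ["images", "color"]))  # color
```

Unlike linear text, nothing forces a reader to visit "text" before "images"; any path permitted by the links is valid.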
Key features and considerations regarding hypermedia and hypertext in multimedia systems include:
Interactivity: Hypermedia and hypertext provide interactivity by allowing users to make choices and
navigate through the content based on their preferences. This non-linear approach contrasts with
traditional linear media.
Navigation Structure: Hypermedia systems often have a hierarchical or networked structure, with
nodes representing individual pieces of content interconnected through hyperlinks. This structure
allows users to move seamlessly between different nodes.
Rich Media Integration: Hypermedia systems incorporate various media formats, enabling a richer
and more immersive experience. This can include images, audio, video, animations, and interactive
elements, all linked together to convey information more dynamically.
User Engagement: The non-linear nature of hypermedia and hypertext encourages user engagement
and exploration. Users have the flexibility to choose their own path through the content, making the
experience more personalized.
Educational and Informational Applications: Hypermedia is widely used in educational
multimedia, interactive tutorials, and informational databases. It allows for the creation of dynamic
learning environments where users can explore topics at their own pace.
Web Browsing: The World Wide Web is a prime example of a hypermedia system. Web pages are
interconnected through hyperlinks, enabling users to navigate from one page to another, accessing a
variety of content types along the way.
Authoring Tools: Various authoring tools and multimedia development platforms support the
creation of hypermedia content. These tools often provide features for linking different types of media
and designing interactive navigation.
Cross-Platform Compatibility: Hypermedia content is designed to be accessible across different
devices and platforms. Web browsers, multimedia players, and other applications support the
rendering and interaction of hyperlinked content.
Chapter - 3
Images
Introduction and Before we start to create:
Still images are among the most important elements of a multimedia project or a web site. To make a multimedia presentation look elegant and complete, it is necessary to spend ample time designing the graphics and the layouts. Competent, computer-literate skills in graphic art and design are vital to the success of a multimedia project.
Images play a crucial role in multimedia systems, enriching the overall user experience by adding
visual elements to complement other forms of media such as text, audio, and video. Here are some key
aspects related to images in multimedia systems:
1. Formats and Compression:
Images can be stored in various formats such as JPEG, PNG, GIF, BMP, and others. Each
format has its own advantages and is suitable for different types of images and use cases.
Compression techniques are often applied to reduce the file size of images, making them
more manageable for storage and faster to transmit over networks.
2. Color Models:
Images are typically represented using color models such as RGB (Red, Green, Blue) or
CMYK (Cyan, Magenta, Yellow, Key/Black). The choice of color model depends on the
application and requirements of the multimedia system.
3. Resolution:
The resolution of an image refers to the number of pixels it contains. Higher resolution
images generally provide more detail but may require more storage space and computational
resources.
4. Image Editing and Processing:
Multimedia systems often incorporate image editing and processing capabilities to
manipulate and enhance images. This may include tasks such as cropping, resizing, filtering,
and applying various effects.
5. Integration with Other Media:
Images are frequently combined with other forms of media, such as text, audio, and video, to
create a rich multimedia experience. Presentations, websites, and educational materials often
leverage a combination of media types.
6. Multimedia Authoring Tools:
Specialized software tools exist for creating multimedia content, allowing users to integrate
and synchronize different media types, including images. These tools often have features for
arranging, editing, and enhancing images within a multimedia project.
7. Interactive Multimedia:
In interactive multimedia applications, users may have control over how they interact with
images. This can include zooming, panning, rotating, or clicking on specific areas for
additional information.
8. Virtual and Augmented Reality:
In virtual reality (VR) and augmented reality (AR) applications, images are a fundamental
component, providing the visual content that users interact with in a simulated or enhanced
environment.
9. Accessibility:
Multimedia systems need to consider accessibility, ensuring that images are described
adequately for users with visual impairments. This is often done through alternative text or
other accessibility features.
10. Streaming and Delivery:
Images, especially in web-based multimedia systems, need to be delivered efficiently.
Techniques such as lazy loading and content delivery networks (CDNs) are employed to
optimize image loading times.
Digital Image Format
There are different kinds of image formats in the literature. We shall consider the image format that
comes out of an image frame grabber, i.e., the captured image format, and the format when images are
stored, i.e., the stored image format.
Bitmaps
A bitmap is a simple information matrix describing the individual dots that are the smallest elements of resolution on a computer screen or other display or printing device. A one-dimensional matrix (one bit per pixel) is enough for monochrome (black and white); greater depth (more bits of information per pixel) is required to describe the more than 16 million colors the picture elements may have, as illustrated in the following figure. The states of all the pixels on a computer screen make up the image seen by the viewer, whether in combinations of black and white or colored pixels in a line of text, a photograph-like picture, or a simple background pattern.
Clip Art
A clip art collection may contain a random assortment of images, or it may contain a series of graphics, photographs, sound, and video related to a single topic. For example, Corel, Micrografx, and Fractal Design bundle extensive clip art collections with their image-editing software.
Bitmap Software
The abilities and features of image-editing programs for both the Macintosh and Windows range from simple to complex. The Macintosh does not ship with a painting tool, and Windows provides only the rudimentary Paint (see the following figure), so we will need to acquire this very important software separately. Often, bitmap editing or painting programs come as part of a bundle when we purchase our computer, monitor, or scanner.
Capturing and Editing Images
The image seen on a computer monitor is a digital bitmap stored in video memory, updated about every 1/60 of a second or faster, depending upon the monitor's scan rate. When images are assembled for a multimedia project, it may often be necessary to capture and store an image directly from the screen. The Print Screen (Prt Scr) key on the keyboard can be used to capture an image.
Scanning Images
Sometimes, after scanning through countless clip art collections, it is still not possible to find the unusual background we want for a screen about gardening. Sometimes when we search for something too hard, we don't realize that it's right in front of our face. Open a scan in an image-editing program and experiment with different filters, the contrast, and various special effects. Be creative, and don't be afraid to try strange combinations; sometimes mistakes yield the most intriguing results.
Vector Drawing
Most multimedia authoring systems provide for use of vector-drawn objects such as lines, rectangles,
ovals, polygons, and text.
Computer-aided design (CAD) programs have traditionally used vector-drawn object systems for creating the highly complex and geometric renderings needed by architects and engineers.
Graphic artists designing for print media use vector-drawn objects because the same mathematics
that put a rectangle on our screen can also place that rectangle on paper without jaggies. This requires the
higher resolution of the printer, using a page description language such as PostScript.
Programs for 3-D animation also use vector-drawn graphics. For example, the various changes of position, rotation, and shading of light required to spin an extruded object are calculated mathematically.
Color
Color is a vital component of multimedia. Management of color is both a subjective and a technical
exercise. Picking the right colors and combinations of colors for our project can involve many tries until we
feel the result is right.
Visible light spans the spectrum from red through orange, yellow, green, and blue to violet. Ultraviolet light, on the other hand, is beyond the higher end of the visible spectrum and can be damaging to humans.
The color white is a noisy mixture of all the color frequencies in the visible spectrum. The cornea of
the eye acts as a lens to focus light rays onto the retina. The light rays stimulate many thousands of
specialized photoreceptors called rods and cones that cover the surface of the retina. The eye can differentiate among millions of colors, or hues, consisting of combinations of red, green, and blue.
Additive Color
In the additive color model, a color is created by combining light of the three primary colors: red, green, and blue (RGB). This is the process used in a TV or computer monitor.
Subtractive Color
In the subtractive color method, a new color is created by combining colored media, such as paints or inks, that absorb (or subtract) some parts of the color spectrum of light and reflect the others back to the eye. Subtractive color is the process used to create color in printing. The printed page is made up of tiny halftone dots of three primary colors: cyan, magenta, and yellow (CMY).
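The complementary relationship between the additive (RGB) and subtractive (CMY) primaries can be sketched as a per-channel inversion. This is a simplified illustration using 8-bit channel values; real print workflows also add a black (K) channel and apply color management:

```python
def rgb_to_cmy(r, g, b):
    # The subtractive primaries are the complements of the additive ones.
    return (255 - r, 255 - g, 255 - b)

def cmy_to_rgb(c, m, y):
    # The inversion is symmetric, so the same operation converts back.
    return (255 - c, 255 - m, 255 - y)

# Pure red light corresponds to magenta + yellow ink (no cyan).
print(rgb_to_cmy(255, 0, 0))    # (0, 255, 255)
print(cmy_to_rgb(0, 255, 255))  # (255, 0, 0)
```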
Unit – II
Chapter – 1
Sound
Introduction
Sound is possibly the most important element of multimedia. It is meaningful "speech" in any language, from an undertone to a scream. It can provide the listening pleasure of music, the startling accent of special effects, or the feel of a mood-setting background. Sound is the terminology used for the analog form, and the digitized form of sound is called audio.
Power of Sound
When something vibrates in the air, moving back and forth, it creates waves of pressure. These waves spread like the ripples from a stone tossed into a still pool, and when they reach the eardrums, the change of pressure, or vibration, is experienced as sound.
Acoustics is the branch of physics that studies sound. Sound pressure levels are measured in decibels (dB); a decibel measurement is actually the ratio between a chosen reference point on a logarithmic scale and the level that is actually experienced.
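The logarithmic ratio just described can be written out directly. The sketch below assumes the standard 20-micropascal reference for sound pressure level, a conventional figure not stated in the text:

```python
import math

REFERENCE_PRESSURE = 20e-6  # 20 micropascals, the usual SPL reference point

def sound_pressure_level(pressure_pa):
    # dB SPL = 20 * log10(p / p0): a ratio against the chosen reference,
    # expressed on a logarithmic scale.
    return 20 * math.log10(pressure_pa / REFERENCE_PRESSURE)

# Ten times the reference pressure is +20 dB; doubling adds about 6 dB.
print(round(sound_pressure_level(REFERENCE_PRESSURE * 10), 1))  # 20.0
print(round(sound_pressure_level(REFERENCE_PRESSURE * 2), 1))   # 6.0
```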
The multimedia application user can use sound right off the bat on both the
Macintosh and on a multimedia PC running Windows because beeps and warning sounds
are available as soon as the operating system is installed. On the Macintosh we can choose one of several sounds for the system alert. In Windows, system sounds are WAV files, and they reside in the Windows\Media subdirectory.
There are still more choices of audio if Microsoft Office is installed. Windows uses WAV as the default file format for audio, and Macintosh systems use SND as the default file format for audio.
Digital Audio:
Digital audio is created when a sound wave is converted into numbers – a process
referred to as digitizing. It is possible to digitize sound from a microphone, a synthesizer,
existing tape recordings, live radio and television broadcasts, and popular CDs. We can digitize sound from natural or prerecorded sources.
Digitized sound is sampled sound. Every nth fraction of a second, a sample of sound is taken and stored as digital information in bits and bytes. The quality of this digital recording depends upon how often the samples are taken.
Preparing Digital Audio Files:
Preparing digital audio files is fairly straightforward. If we have analog source materials, such as music or sound effects recorded on analog media like cassette tapes, the first step is to digitize the analog material by recording it onto computer-readable digital media.
It is necessary to focus on two crucial aspects of preparing digital audio files:
1. Balancing the need for sound quality against the available RAM and hard disk resources.
2. Setting proper recording levels to get a good, clean recording.
Remember that the sampling rate determines the frequency at which samples will be taken for the recording. Sampling at higher rates more accurately captures the high-frequency content of the sound. Audio resolution (bit depth) determines the accuracy with which a sound can be digitized.
Formula for determining the size of a digital audio file:
file size (bytes) = sampling rate (Hz) × duration (s) × (bit resolution ÷ 8) × number of channels (1 for mono, 2 for stereo)
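The file-size formula can be checked with a short worked example. The CD-quality values below (44.1 kHz, 16-bit, stereo) are common reference figures rather than values from the text:

```python
def audio_file_bytes(sampling_rate, bits_per_sample, channels, seconds):
    # Uncompressed PCM size: samples/second * bytes/sample * channels * time.
    return sampling_rate * (bits_per_sample // 8) * channels * seconds

# One minute of CD-quality stereo audio:
size = audio_file_bytes(44_100, 16, 2, 60)
print(size)                    # 10584000 bytes
print(round(size / 2**20, 1))  # about 10.1 MB
```

Halving any one factor (sample rate, bit depth, or channel count) halves the file size, which is the trade-off against quality discussed above.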
Once a recording has been made, it will almost certainly need to be edited. The basic sound editing operations that most multimedia producers need are described in the paragraphs that follow.
1. Multiple Tracks: Being able to edit and combine multiple tracks, and then merge the tracks and export them in a final mix to a single audio file.
2. Trimming: Removing dead air or blank space from the front of a recording, and any unnecessary extra time off the end, is our first sound editing task.
3. Splicing and Assembly: Using the same tools mentioned for trimming, we will probably want to remove the extraneous noises that inevitably creep into a recording.
4. Volume Adjustments: If we are trying to assemble ten different recordings into a single track, there is little chance that all the segments will have the same volume.
5. Format Conversion: In some cases our digital audio editing software might
read a format different from that read by our presentation or authoring program.
Digital Audio vs. MIDI

Digital Audio:
1. Actual sound is stored in a digital audio file.
2. Files are large in size and loose.
3. The quality of sound is in proportion to the file size.
4. They reproduce the exact sound in a digital format.

MIDI:
1. No actual sound is stored in a MIDI file.
2. Files are small in size and compact.
3. The quality of sound is not in proportion to the file size.
4. They sound a little different from the original sound.
Adding sound to a multimedia project can enhance the overall experience and engagement for the
audience. Depending on the type of multimedia project you are working on, there are various ways to
incorporate sound. Here are some general guidelines and methods you can use:
1. Audio Formats:
Ensure that your audio files are in a compatible format. Common formats include MP3, WAV,
and AAC.
Choose a format that balances quality and file size, depending on the platform and purpose of
your multimedia project.
2. Sound Editing Software:
Use sound editing software to create and edit your audio files. Popular tools include Audacity,
Adobe Audition, and GarageBand.
Edit your audio files to remove any unwanted background noise, adjust volume levels, and make
other necessary enhancements.
3. Narration:
If your multimedia project involves a presentation or storytelling, consider adding a narration
track. This can be a voiceover explaining the content or providing additional context.
Ensure that the narration is clear, well-paced, and synchronized with the visuals.
4. Background Music:
Background music can add atmosphere and emotion to your multimedia project. Choose music
that complements the mood and theme of your content.
Adjust the volume levels so that the music doesn't overpower other audio elements or distract
from the main content.
5. Sound Effects:
Incorporate sound effects to enhance specific actions or events in your multimedia project. For
example, footsteps, doorbell rings, or applause can make your content more dynamic.
Ensure that the sound effects are relevant and not overly loud or distracting.
6. Timing and Synchronization:
Pay attention to the timing and synchronization of your audio elements with the visual
components. Proper synchronization enhances the overall impact of your multimedia project.
Use timeline-based editing features in multimedia authoring tools to align sound events with
visual cues.
7. Multimedia Authoring Tools:
Use multimedia authoring tools like Adobe Animate, Adobe Premiere, or other similar
platforms that allow you to integrate audio seamlessly with visuals.
These tools often provide features for layering audio tracks, adjusting volume levels, and
synchronizing audio with animations.
8. Compression:
Compress audio files appropriately to balance file size and audio quality, especially if your
multimedia project is intended for online distribution.
9. Testing:
Test your multimedia project on different devices and platforms to ensure that the audio plays
correctly and is well-balanced across various environments.
10. Accessibility:
Consider accessibility by providing options for subtitles or transcripts, especially if your
multimedia project includes spoken content.
Chapter – 2
Animation
Introduction
Animation makes static presentations come alive. It is visual change over time and can add great
power to our multimedia projects. Carefully planned, well-executed video clips can make a dramatic
difference in a multimedia project. Animation is created from drawn pictures, while video is created from
real-time visuals.
Principles of Animation
Animation is the rapid display of a sequence of images of 2-D artwork or model positions in order to
create an illusion of movement. It is an optical illusion of motion due to the phenomenon of persistence of
vision, and can be created and demonstrated in a number of ways. The most common method of presenting
animation is as a motion picture or video program, although several other forms of presenting animation also
exist.
Animation is possible because of a biological phenomenon known as persistence of vision and a
psychological phenomenon called phi. An object seen by the human eye remains chemically mapped on the
eye's retina for a brief time after viewing. Combined with the human mind's need to conceptually complete
a perceived action, this makes it possible for a series of images that are changed very slightly and very
rapidly, one after the other, to seemingly blend together into a visual illusion of movement. The following
shows a few cels, or frames, of a rotating logo. When the images are progressively and rapidly changed, the
arrow of the compass is perceived to be spinning.
Television video builds many entire frames or pictures every second; the speed with which each frame is
replaced by the next one makes the images appear to blend smoothly into movement. To make an object
travel across the screen while it changes its shape, just change the shape and also move, or translate, it a few
pixels for each frame.
Animation Techniques
When you create an animation, organize its execution into a series of logical steps. First, gather up in your
mind all the activities you wish to provide in the animation; if it is complicated, you may wish to create a
written script with a list of activities and required objects. Choose the animation tool best suited for the job.
Then build and tweak your sequences; experiment with lighting effects. Allow plenty of time for this phase
when you are experimenting and testing. Finally, post-process your animation, doing any special rendering
and adding sound effects.
i. Cel Animation
The term cel derives from the clear celluloid sheets that were used for drawing each frame, which have
been replaced today by acetate or plastic. Cels of famous animated cartoons have become sought-after,
suitable-for-framing collector's items. Cel animation artwork begins with keyframes (the first and last frame
of an action). For example, when an animated figure of a man walks across the screen, he balances the
weight of his entire body on one foot and then the other in a series of falls and recoveries, with the opposite
foot and leg catching up to support the body.
The animation techniques made famous by Disney use a series of progressively different drawings on each
frame of movie film, which plays at 24 frames per second.
A minute of animation may thus require as many as 1,440 separate frames.
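The 1,440-frame figure is direct arithmetic on the frame rate:

```python
FPS = 24          # film plays at 24 frames per second
seconds = 60      # one minute of animation
frames = FPS * seconds
print(frames)     # 1440 separate drawings
```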
ii. Computer Animation
In 3-D animation, the animator's effort goes into creating models of individual objects and designing
the characteristics of their shapes and surfaces.
Paint is most often filled or drawn with tools using features such as gradients and anti-aliasing.
iii. Kinematics
It is the study of the movement and motion of structures that have joints, such as a walking man.
Inverse kinematics, available in high-end 3-D programs, is the process by which you link objects such
as hands to arms and define their relationships and limits.
Once those relationships are set you can drag these parts around and let the computer calculate the
result.
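The "link the parts, then let the computer calculate" idea can be sketched with a minimal two-segment arm solver (an illustrative assumption, not the algorithm of any particular 3-D package): given a target point for the hand, the law of cosines yields the elbow angle, and the shoulder angle follows.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Return (shoulder, elbow) angles in radians that place the end of a
    two-segment arm of lengths l1, l2 at the target point (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle; clamp for floating-point safety.
    c2 = max(-1.0, min(1.0, (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)))
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics: where does the hand end up for these joint angles?"""
    ex = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    ey = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return ex, ey

s, e = two_link_ik(1.0, 1.0, 1.0, 1.0)
print(forward(s, e, 1.0, 1.0))  # approximately (1.0, 1.0)
```

Running the solver and then the forward function confirms the computed angles do place the hand at the requested target.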
iv. Morphing
Morphing is a popular effect in which one image transforms into another. Morphing applications and
other modeling tools that offer this effect can perform transitions not only between still images but
often between moving images as well.
The morphed images were built at a rate of 8 frames per second, with each transition taking a total
of 4 seconds.
Some products that offer morphing features are as follows:
o Black Belt's EasyMorph and WinImages
o Human Software's Squizz
o Valis Group's Flo, MetaFlo, and MovieFlo.
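A simple morph can be sketched as a per-pixel cross-dissolve between two images; using the figures above, 8 frames per second over a 4-second transition gives 32 in-between frames. (This is an illustrative sketch with hypothetical names: real morphing tools also warp geometry between matched control points, which a plain cross-dissolve omits.)

```python
def blend_pixel(a, b, t):
    """Linearly interpolate two RGB pixels; t runs from 0 (image A) to 1 (image B)."""
    return tuple(round((1 - t) * ca + t * cb) for ca, cb in zip(a, b))

def morph_frames(img_a, img_b, fps=8, seconds=4):
    """Generate the in-between frames of a cross-dissolve morph."""
    n = fps * seconds  # 32 transition steps, as in the text
    frames = []
    for i in range(n + 1):
        t = i / n
        frames.append([blend_pixel(a, b, t) for a, b in zip(img_a, img_b)])
    return frames

# Two one-pixel 'images': pure red morphing to pure blue.
frames = morph_frames([(255, 0, 0)], [(0, 0, 255)])
print(len(frames))    # 33 (start frame plus 32 transitions)
print(frames[0][0])   # (255, 0, 0)
print(frames[-1][0])  # (0, 0, 255)
```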
Chapter – 3
Video
Video has become an integral part of multimedia, improving the overall user experience and
engagement. Most people today would rather watch a video tutorial on YouTube than read a step-by-step
tutorial. That's why every content creator – even if they create e-books, flipbooks, or other types
of multimedia presentation – should know how to use digital video files in their multimedia marketing.
Analog video
Analog video refers to the representation of visual information using continuous, variable signals. In the
context of multimedia systems, analog video has been widely used in the past for various applications, such
as television broadcasting, VHS tapes, and older video cameras. Here are some key aspects of analog video
in multimedia systems:
1. Signal Representation:
Continuous Waveform: Unlike digital video, which represents information using discrete
values (pixels), analog video uses continuous waveforms to represent the changing voltage
levels corresponding to the varying intensity of light in a scene.
Composite Video: In many analog video systems, the entire visual information (including
color and brightness) is combined into a single composite signal.
2. Resolution and Quality:
Limited Resolution: Analog video typically has lower resolution compared to digital
formats. The quality of the video is influenced by factors such as the bandwidth of the signal
and the recording/playback equipment.
3. Transmission and Storage:
Broadcasting: Analog video signals were traditionally used for broadcasting television
signals. These signals were transmitted over the air or through cable systems.
VHS Tapes: Analog video was commonly recorded and stored on VHS tapes. The quality of
the recorded video depended on the tape format and the recording device.
4. Color Encoding:
Analog Color Systems: Analog video systems used different methods for encoding
color information: NTSC (National Television System Committee) in North America,
PAL (Phase Alternating Line) in Europe and parts of Asia, and SECAM
(Séquentiel Couleur à Mémoire) in some other regions.
5. Drawbacks of Analog Video:
Signal Degradation: Analog signals are susceptible to degradation and interference during
transmission or copying, resulting in a loss of picture quality.
Noisy Playback: Analog video playback can suffer from noise, distortion, and artifacts over
time.
6. Transition to Digital:
Advantages of Digital Video: The transition to digital video formats has brought several
advantages, including higher resolution, better signal quality, and ease of editing and
manipulation.
Digital Multimedia Systems: Modern multimedia systems predominantly use digital video
formats for recording, storage, and playback.
Digital video
Digital video has become the dominant format in modern multimedia systems due to its numerous
advantages over analog video. Here are key aspects of digital video in multimedia systems:
1. Representation of Information:
Discrete Values: Digital video represents visual information using discrete values, typically
in the form of pixels. Each pixel has a specific color and brightness value, contributing to the
overall image.
2. Resolution and Quality:
High Resolution: Digital video supports higher resolutions compared to analog, allowing for
sharper and more detailed images.
HD and 4K: Digital video commonly includes high-definition (HD) and ultra-high-definition
(4K) formats, providing improved clarity and visual fidelity.
3. Compression:
Efficient Storage: Digital video can be compressed using various codecs (compression-
decompression algorithms) to efficiently store and transmit video data without significant
loss of quality.
Streaming: Compression enables the streaming of high-quality video content over the
internet, contributing to the popularity of online video platforms.
4. Color Representation:
RGB: Digital video often uses the RGB (Red, Green, Blue) color model to represent colors,
providing a wide and accurate range of color possibilities.
YCbCr: YCbCr is another common color representation in digital video, separating
luminance (brightness) and chrominance (color information) components.
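The luminance/chrominance split can be illustrated with the widely used BT.601 conversion (one of several standard coefficient sets; HD video typically uses BT.709 coefficients instead):

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit RGB to YCbCr using the BT.601 full-range formulas."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b            # luminance (brightness)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return round(y), round(cb), round(cr)

print(rgb_to_ycbcr(255, 255, 255))  # white: full luma, neutral chroma (128, 128)
print(rgb_to_ycbcr(0, 0, 0))        # black: zero luma, neutral chroma
```

Because the eye is less sensitive to chroma than to luma, codecs can store Cb and Cr at reduced resolution (chroma subsampling), which is one reason this representation is preferred over RGB for compression.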
5. Editing and Manipulation:
Non-destructive Editing: Digital video allows for non-destructive editing, where changes
can be made without degrading the original source. This includes cutting, merging, and
applying various effects.
Special Effects: Digital video facilitates the incorporation of special effects, CGI (Computer-
Generated Imagery), and other post-production enhancements.
6. Storage and Distribution:
Media Files: Digital video is typically stored as files in formats like MP4, AVI, MKV, or
others.
Media Servers: Digital video can be easily distributed and accessed through media servers,
streaming services, and online platforms.
7. Playback Devices:
Diverse Platforms: Digital video can be played on a wide range of devices, including
computers, smartphones, tablets, smart TVs, and dedicated media players.
Compatibility: The standardization of digital video formats ensures broad compatibility
across different devices and software applications.
8. Multichannel Audio:
Surround Sound: Digital video often includes multichannel audio formats, enabling
immersive surround sound experiences.
9. Interactive Features:
Interactivity: Digital video supports interactive features, such as clickable annotations,
subtitles, and menu navigation in the case of DVDs or Blu-ray discs.
10. Evolution and Advancements:
3D Video: Digital video can support 3D formats for a more immersive viewing experience.
HDR (High Dynamic Range): Digital video can incorporate HDR technology, enhancing
contrast and color for a more lifelike image.
publishing your content. This includes choosing the right angle, lighting, background, sound, transitions,
effects, or captions to ensure consistency and coherence. Finally, before you launch your product
demonstration, you need to test your video and multimedia on different devices, browsers and platforms to
ensure accessibility and interactivity.
Video Formats
In practice, file extensions are used synonymously with video formats; for instance, MP4 in
"Videofile.mp4". However, this isn't entirely correct.
Most file formats comprise a combination of files, folders, and playlists (TS, M3U8, etc)—which are
necessary to play a video properly.
It is important to understand that video formats are different from video file formats/file
extensions, e.g., MOV (QuickTime Movie), WMV (Windows Media Video), AVI (Audio Video
Interleave), MP4 (MPEG-4 Part 14), etc.
Some of the most popular video streaming formats today are MP4, MPEG-DASH, and HLS.
Video Containers
File extensions for video files actually represent Containers—which contain the entire gamut of files
required to play a video. This information includes the metadata and the video and audio streams.
The video stream instructs the video player as to what should be displayed on the screen,
whereas the audio stream ensures the right sound is played for the specific video.
The metadata, or "data about data," comprises a slew of information on the video file, for instance,
its resolution, date of creation or modification, bit-rate type, subtitles, and so on.
Video Codecs
As is evident, "codec" is a combination of the words coder and decoder.
Codecs encode video or audio streams to create more manageable and streamable sizes of video and audio
files.
The video player or platform on which the video is played then decodes it depending on the
information contained in that codec and plays back the video while maintaining the quality of the original.
Similar to containers, there is a slew of different codecs in existence today to be used with different audio
and video files—some of which include H.264, H.265, VP9, AAC, MP3, and so on.
Cons:
Decoding H.265 may require more computational power compared to H.264.
Some implementations of H.265 may involve licensing fees.
Use Case: Ideal for applications requiring high compression efficiency, such as streaming UHD content and
video surveillance.
VP9
VP9 is an open and royalty-free video codec developed by Google as a successor to VP8.
Pros:
Achieves good video quality at lower bitrates, similar to H.265.
Hardware support for VP9 is present in many devices, improving playback efficiency.
Cons:
VP9 may not achieve the same level of compression efficiency as newer codecs like AV1, resulting in
larger file sizes compared to more advanced alternatives.
VP9 decoding requires more computational power compared to older codecs like H.264, potentially
leading to performance issues on less powerful devices.
Use Case: Commonly used for streaming high-quality videos on the web, particularly in platforms that
prioritize royalty-free codecs.
AV1
AV1 is an open and royalty-free video codec developed by the Alliance for Open Media (AOMedia).
Pros:
Designed to provide advanced compression efficiency, potentially surpassing H.265 and VP9.
AV1 is an open standard with no licensing fees, making it cost-effective for content creators.
Cons:
AV1 encoding is slow compared to some other codecs, which may impact the efficiency of the content
creation process, especially for real-time applications.
AV1's widespread adoption is still in progress, and some platforms or devices may not fully support this
codec, leading to compatibility challenges.
Use Case: Emerging as a preferred codec for high-quality streaming and online video services due to its
efficiency and royalty-free nature.
Common Video Formats Lists
MP4
MP4 is a versatile video format widely used for its compatibility across devices and platforms.
Pros:
Utilizes efficient codecs like H.264 for effective video compression without sacrificing quality.
Allows embedding metadata such as subtitles, multiple audio tracks, and chapter information.
Cons:
MP4 files can be less editable compared to some other formats like AVI.
Excessive compression may lead to noticeable quality degradation.
Use Case: Ideal for sharing videos online, streaming, and playing on a diverse range of devices.
MOV
MOV is a multimedia container format developed by Apple for high-quality video playback.
Pros:
Commonly associated with Apple devices, MOV supports high-quality video and audio playback.
Allows the use of various video and audio codecs, providing versatility in content creation.
Cons:
Uncompressed MOV files can be large, requiring significant storage space.
MOV files may face compatibility issues on non-Apple devices.
Use Case: Well-suited for high-quality video production, editing, and playback within the Apple ecosystem.
AVI
AVI is a multimedia container format developed by Microsoft for storing video and audio data.
Pros:
Supports various video and audio codecs, providing flexibility in content creation.
AVI files generally have lower compression overhead, preserving original video quality.
Cons:
Compared to newer formats, AVI has limited support for embedded metadata.
Uncompressed or less compressed AVI files can result in large file sizes.
Use Case: Suitable for video editing, as it allows for minimal compression and maintains high-quality
source footage.
WebM
WebM is an open, royalty-free multimedia container format designed for efficient web video streaming.
Pros:
WebM is an open standard with no licensing fees, making it cost-effective for content creators.
Utilizes VP9 and AV1 codecs for efficient compression, enabling high-quality streaming.
Cons:
Not as widely used in commercial video production compared to other formats.
Some older devices and software may not fully support WebM, leading to compatibility issues.
Use Case: Optimized for streaming high-quality videos on the web, especially in scenarios where royalty-
free formats are preferred.
Shooting and editing video
To add full-screen, full-motion video to your multimedia project, you will need to invest in
specialized hardware and software or purchase the services of a professional video production studio. In
many cases, a professional studio will also provide editing tools and post-production capabilities.
Video for Windows is an external set of software that works along with the multimedia extensions for
Windows. It has features for digitized video recording, playback, and editing. The VidCap utility of this
software is used to capture video and audio clips using external hardware. The captured sequence can be
viewed at a number of different sizes and speeds, and different colour palettes can be created for individual
frames. Video for Windows has four different editing features, named VidEdit, PalEdit, WavEdit,
and BitEdit. As the names suggest:
VidEdit is used to cut and paste captured video segments together.
WavEdit works with the recorded digital audio and helps you to edit it.
PalEdit works with the colour palettes within the captured video to improve the colour.
BitEdit helps clean up the rough patches in the images. It also has an interface to the media
control panel to control digital video files.
1. Compositions: Composition is at the heart of making attractive video, because it focuses not on
things like story line and plot development, or even the more technical issues of color balance,
lighting and audio levels. Rather, composition is all about the placement of your subject(s) in the
frame so that the effect is as pleasing to the eye as possible.
2. Video Compression: Video takes up a lot of space. Uncompressed recording from a camcorder
takes up about 17MB per second of video. Because it takes up so much space, video must be
compressed before it is used. "Compressed" means that the information is packed into a smaller
space. There are two kinds of compression: lossy and lossless.
3. Lossy compression: Lossy compression means that the compressed file has less data in it than
the original file. In some cases this translates to lower quality files, because information has been
"lost". Lossy compression makes up for the loss in quality by producing comparatively small files.
For example, DVDs are compressed using the MPEG-2 format, which can make files 15 to 30 times
smaller, but we still tend to perceive DVDs as having high-quality picture.
4. Lossless compression: Lossless compression is exactly what it sounds like: compression where
none of the information is lost. This is not nearly as useful, because files often end up being the same
size as they were before compression, and reducing file size is the primary goal of compression.
However, if file size is not an issue, using lossless compression will result in a perfect-quality
picture. For example, a video editor transferring files from one computer to another using a hard
drive might choose to use lossless compression to preserve quality while he or she is working.
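The lossless property is easy to demonstrate with Python's standard zlib module: after a compress/decompress round trip every byte is identical, whereas a lossy codec by definition could not restore the original exactly.

```python
import zlib

# Highly repetitive data, standing in for redundant video frames, compresses well.
original = b"frame" * 10000          # 50,000 bytes
packed = zlib.compress(original)

print(len(packed) < len(original))          # True: the data got smaller
print(zlib.decompress(packed) == original)  # True: not one byte was lost
```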
5. Lighting: Perhaps the greatest difference between professional camcorders and consumer
camcorders is their ability to perform at low light levels. Using a simple floodlight kit, or even just
being sure that daylight illuminates the room, can improve your image. Onboard battery lights for
camcorders can be useful but only in conditions where the light acts as a "fill light" to illuminate the
details of a subject's face. The standard lighting arrangement of a studio is displayed with fill, key,
rim, and background lights. Changing any of these lights can make a dramatic difference in the shot.
6. Chroma keys: Chroma keys allow you to choose a color or range of colors that become
transparent, allowing the video image to be seen "through" the computer image. This is the
technology used by a newscast's weather person, who is shot against a blue background that is made
invisible when merged with the electronically generated image of the weather map.
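The chroma-key idea can be sketched in a few lines: any pixel close enough to the key color is treated as transparent and replaced by the background pixel behind it. (A toy sketch with hypothetical names; production keyers work in perceptual color spaces and feather the edges.)

```python
def chroma_key(foreground, background, key=(0, 0, 255), tolerance=60):
    """Replace foreground pixels near the key color with the background pixel.
    Pixels are (R, G, B) tuples; 'near' is a simple per-channel distance check."""
    out = []
    for fg, bg in zip(foreground, background):
        if all(abs(c - k) <= tolerance for c, k in zip(fg, key)):
            out.append(bg)   # 'transparent': show the weather map behind
        else:
            out.append(fg)   # keep the presenter's pixel
    return out

blue_screen = [(10, 5, 250), (200, 180, 160)]  # blue backdrop pixel + skin tone
weather_map = [(0, 128, 0), (0, 128, 0)]
print(chroma_key(blue_screen, weather_map))
# [(0, 128, 0), (200, 180, 160)] -> backdrop replaced, presenter kept
```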
7. Blue screen: Blue screen is a popular technique for making multimedia titles because expensive
sets are not required. Incredible backgrounds can be generated using 3-D modeling and graphic
software, and one or more actors, vehicles, or other objects can be neatly layered onto that
background.
Unit – III
Making Multimedia: The Stages of a Multimedia Project, the Intangibles, Hardware, Software, Authoring
Systems.
Designing and producing: designing the structure, designing the user interface, a multimedia design case
history, producing.
CHAPTER 1: MAKING MULTIMEDIA
Definition: How do you make multimedia?
In this chapter you will learn how multimedia is made, with guidance and suggestions for getting
started, and about planning a project. You will learn about producing, managing, and designing a
project; getting material and content; testing the work; and, ultimately, shipping it to end users or
posting it to the Web.
4. Delivering
In the workplace, use quality equipment and software for your communications setup. The
cost—in both time and money—of stable and fast networking will be repaid.
Connection Transfer Rate
Serial port 115 Kbps (0.115 Mbps)
Standard parallel port 115 Kbps (0.115 Mbps)
USB (Original 1.0) 12 Mbps (1.5 Mbps at low speed)
SCSI-2 (Fast SCSI) 80 Mbps
SCSI (Wide SCSI) 160 Mbps
Ultra2 SCSI 320 Mbps
FireWire 400 (IEEE 1394) 400 Mbps
USB (Hi-Speed 2.0) 480 Mbps
SCSI (Wide Ultra2) 640 Mbps
FireWire 800 (IEEE 1394) 800 Mbps
SCSI (Wide Ultra3) 1,280 Mbps
SATA 150 1,500 Mbps
SCSI (Ultra4) 2,560 Mbps
SATA 300 3,000 Mbps
FireWire 3200 (IEEE 1394) 3,144 Mbps
USB (Super-Speed 3.0) 3,200 Mbps
SCSI (Ultra5) 5,120 Mbps
SATA 600 6,000 Mbps
Fibre Channel (Optic) 10,520 Mbps
Table: Maximum Transfer Rates for Various Connections in Megabits Per Second
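Because the table rates are in megabits per second, estimating a transfer time means converting file sizes in megabytes to megabits (multiply by 8). A quick sketch (hypothetical file size, rates from the table; sustained real-world throughput is lower than these theoretical maxima):

```python
def transfer_seconds(file_megabytes, rate_mbps):
    """Time to move a file: megabytes -> megabits (x8), divided by the link rate."""
    return file_megabytes * 8 / rate_mbps

# A 700 MB CD image over Hi-Speed USB 2.0 (480 Mbps) vs. a serial port (0.115 Mbps)
print(round(transfer_seconds(700, 480), 1))           # about 11.7 seconds
print(round(transfer_seconds(700, 0.115) / 3600, 1))  # about 13.5 hours
```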
SCSI
The Small Computer System Interface (SCSI—pronounced "scuzzy") adds peripheral
equipment such as disk drives, scanners, CD-ROM players, and other peripheral devices that
conform to the SCSI standard.
SCSI connections may connect internal devices, such as hard drives mounted inside the
computer's chassis and powered by its power supply, and external devices, which sit outside
the chassis and are plugged into the computer.
IDE, EIDE, Ultra IDE, ATA, and Ultra ATA
Integrated Drive Electronics (IDE) connections, also known as Advanced Technology
Attachment (ATA) connections, are typically only internal, and they connect hard disks, CD-
ROM drives, and other peripherals mounted inside the PC.
With IDE controllers, you can install a combination of hard disks, CD-ROM drives, or other
devices in your PC.
The circuitry for IDE is typically much less expensive than for SCSI, but comes with some
limitations. For example, IDE requires time from the main processor chip, so only one drive in a
master/slave pair can be active at once.
USB
A consortium of industry players including Compaq, Digital Equipment, IBM, Intel, Microsoft,
NEC, and Northern Telecom was formed in 1995 to promote a Universal Serial Bus (USB)
standard for connecting devices to a computer.
These devices are automatically recognized ("plug-and-play") and installed without users
needing to install special cards or turn the computer off and on when making the connection
(allowing "hot-swapping").
USB technology has improved in performance.
USB uses a single cable to connect as many as 127 USB peripherals to a single personal computer.
Hubs can be used to "daisy-chain" many devices. USB connections are now common on video
game consoles, cameras, GPS locators, cell phones, televisions, MP3 players, PDAs, and
portable memory devices.
Memory and Storage Devices
You will likely need to add more memory and storage space to your computer, such as a large-capacity hard disk.
To estimate the memory requirements of a multimedia project—the space required on a hard
disk, thumb drive, CD-ROM, or DVD, not the random access memory (RAM) used while your
computer is running—you must have a sense of the project's content and scope.
Color images, text, sound bites, video clips, and the programming code that glues it all together
all require memory.
When making multimedia, you will also need to allocate memory for storing and archiving working files
used during production: original audio and video clips, edited pieces, final mixed pieces,
production paperwork and correspondence, and at least one backup of your project files, with a
second backup stored at another location.
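A rough storage estimate can be sketched from the uncompressed sizes of a project's elements (all figures below are hypothetical examples, not requirements):

```python
def image_bytes(width, height, bytes_per_pixel=3):
    """Uncompressed size of one color image (24-bit color = 3 bytes per pixel)."""
    return width * height * bytes_per_pixel

def audio_bytes(seconds, sample_rate=44100, bit_depth=16, channels=2):
    """Uncompressed size of a PCM audio clip."""
    return int(seconds * sample_rate * (bit_depth / 8) * channels)

# Hypothetical project: 200 full-screen images plus 30 minutes of CD-quality audio
total = 200 * image_bytes(640, 480) + audio_bytes(30 * 60)
print(round(total / 2**20))  # roughly 479 MB before any compression
```

Estimates like this are why production archives (working files, originals, and backups) can dwarf the size of the delivered project.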
Random Access Memory (RAM)
Faced with budget constraints, you can certainly produce a multimedia project on a slower or
limited-memory computer.
A fast processor without enough RAM may waste processor cycles while it swaps needed
portions of program code into and out of memory.
Increasing available RAM may show more performance improvement on the system than
upgrading the processor chip.
Read-Only Memory (ROM)
Unlike RAM, read-only memory (ROM) is not volatile.
When you turn off the power to a ROM chip, it will not forget, or lose, its memory.
ROM is typically used in computers to hold the small BIOS program that initially boots
up the computer, and it is used in printers to hold built-in fonts.
Hard Disks
Adequate storage space for your production environment can be provided by large-capacity hard
disks, server-mounted on a network.
As multimedia has reached consumer desktops, makers of hard disks have built smaller-
profile, larger-capacity, faster, and less-expensive hard disks.
As network and Internet servers drive the demand for centralized data storage requiring terabytes
(one trillion bytes), hard disks are often configured into fail-proof redundant arrays offering
built-in protection against crashes.
CD-ROM Discs
Compact disc read-only memory (CD-ROM) players have become an integral part of the
multimedia development workstation and are an important delivery vehicle for mass-produced
projects.
A wide variety of developer utilities, graphic backgrounds, stock photography and sounds,
applications, games, reference texts, and educational software are available on this medium.
CD-ROM players have typically been very slow to access and transmit data (150 KBps, which
is the speed required of consumer audio CDs), but developments have led to double-, triple-,
quadruple-speed, 24x, 48x, and 56x drives designed specifically for computer (not Red Book
Audio) use.
With a compact disc recorder, you can make your own CDs, using CD-recordable (CD-R) blank
discs to create a CD in most formats of CD-ROM and CD-Audio.
A CD-RW (rewritable) recorder can rewrite 700MB of data to a CD-RW disc about 1,000
times.
Digital Versatile Discs (DVD)
In December 1995, nine major electronics companies (Toshiba, Matsushita, Sony, Philips, Time
Warner, Pioneer, JVC, Hitachi, and Mitsubishi Electric) agreed to promote a new optical disc
technology for distribution of multimedia and feature-length movies called Digital Versatile
Disc (DVD).
With a DVD capable not only of gigabyte storage capacity but also full-motion video (MPEG2)
and high-quality audio in surround sound, this is an excellent medium for delivery of
multimedia projects.
There are three types of DVD, including
o DVD-Read Write,
o DVD-Video, and
o DVD-ROM.
These types reflect marketing channels, not the technology.
Feature comparison: DVD Specification vs. Blu-ray Specification
Disc diameter: DVD: 120 mm (5 inches); Blu-ray: 120 mm (5 inches)
Disc thickness: DVD: 1.2 mm (0.6 mm thick disc × 2); Blu-ray: 1.2 mm (0.6 mm thick disc × 2)
Memory capacity: DVD: 4.7 gigabytes per single side; Blu-ray: 25 gigabytes per single layer
Wavelength of laser diode: DVD: 650 nanometer/635 nanometer (red); Blu-ray: 405 nanometer (blue-violet)
Data transfer rate (1x): DVD: variable-speed data transfer at an average rate of 4.69 Mbps for image and sound; Blu-ray: variable-speed data transfer at an average rate of 36 Mbps for image and sound
Image compression: DVD: MPEG-2 digital image compression; Blu-ray: MPEG-2 Part 2, H.264/MPEG-4 AVC, and SMPTE VC-1
Audio: DVD: Dolby AC-3 (5.1 ch), LPCM for NTSC, and MPEG Audio, LPCM for PAL/SECAM (a maximum of 8 audio channels and 32 subtitle channels can be stored); Blu-ray: Dolby Digital (AC-3), DTS, and linear PCM
Running time (movies): DVD: single layer (4.7GB): 133 minutes a side (at an average data rate of 4.69 Mbps for image and sound, including three audio channels and four subtitle channels); Blu-ray: single layer (25GB): about two hours of HD content encoded using MPEG-2, or about 4 hours of HD-quality video and audio using VC-1 or MPEG-4 AVC codecs
What You Need: Software
Multimedia software tells the hardware what to do.
Display the color red. Move that tiger three leaps to the left. Slide in the words "Now You've
Done It!" from the right and blink them on and off.
Play the sound of cymbals crashing. Run the digitized trailer for Avatar. Turn down the volume
on that MP3 file!
The basic tool set for building multimedia projects contains one or more authoring systems and
various editing applications for text, images, sounds, and motion video.
A few additional applications are also useful for capturing images from the screen, translating
file formats, and moving files among computers.
The tools used for creating and editing multimedia elements on both Windows and Macintosh
platforms do image processing and editing, drawing and illustration, 3-D and CAD, OCR and
text editing, sound recording and editing, video and moviemaking, and various utilitarian
housekeeping tasks.
Text Editing and Word Processing Tools
A word processor is usually the first software tool computer users learn.
From letters, invoices, and storyboards to project content, a word processor may also be the tool
you use most often as you design and build a multimedia project.
The better your keyboarding or typing skills, the easier and more efficient your multimedia day-to-
day life will be.
An office or workgroup will often choose a single word processor for sharing documents in a
standard format. Word processors such as Microsoft Word and WordPerfect are powerful
applications that include spell checkers, table formatters, thesauruses, and prebuilt templates
for letters, résumés, purchase orders, and other common documents.
Many developers have begun to use OpenOffice (www.openoffice.org) for word processing,
spreadsheets, presentations, graphics, databases, and more.
It can be downloaded and used completely free of charge for any purpose and is available in
many languages.
It can read and write files from other, more expensive, office packages. In many word processors,
you can embed multimedia elements such as sounds, images, and video.
OCR Software
OCR software turns bitmapped characters into electronically recognizable ASCII text.
A scanner is typically used to create the bitmap. Then the software breaks the bitmap into
chunks according to whether it contains text or graphics, by examining the texture and density
of areas of the bitmap and by detecting edges.
The text areas of the image are then converted to ASCII characters using probability and expert
system algorithms.
Most OCR applications claim about 99 percent accuracy when reading 8- to 36-point printed
characters at 300 dpi and can reach processing speeds of about 150 characters per second.
These programs do, however, have difficulty recognizing poor copies of originals where the
edges of characters have bled; these, and poorly received faxes in small print, may yield more
recognition errors than are worthwhile to correct after the attempted recognition.
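The matching step described above can be sketched in a few lines. This toy recognizer compares a scanned character chunk against known glyph templates and picks the best-scoring match, much as the probability-based algorithms do. The 3x5 glyphs and function names here are invented for illustration; they are not from any real OCR engine.

```python
# Toy illustration of the template-matching idea behind OCR: each scanned
# chunk (a small bitmap) is scored against known glyph templates, and the
# best match wins. "#" marks an ink pixel, "." a blank one.

TEMPLATES = {
    "I": ["###", ".#.", ".#.", ".#.", "###"],
    "L": ["#..", "#..", "#..", "#..", "###"],
    "O": ["###", "#.#", "#.#", "#.#", "###"],
}

def match_score(bitmap, template):
    """Fraction of pixels that agree between a scanned chunk and a template."""
    total = agree = 0
    for row_b, row_t in zip(bitmap, template):
        for px_b, px_t in zip(row_b, row_t):
            total += 1
            agree += (px_b == px_t)
    return agree / total

def recognize(bitmap):
    """Pick the template character with the highest agreement score."""
    return max(TEMPLATES, key=lambda ch: match_score(bitmap, TEMPLATES[ch]))

# A slightly noisy "L" (one stray pixel) is still recognized correctly.
noisy_l = ["#..", "#..", "#.#", "#..", "###"]
print(recognize(noisy_l))  # L
```

Real OCR engines add segmentation, edge detection, and expert-system heuristics on top of this basic scoring idea.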
Painting and Drawing Tools
Painting and drawing tools, as well as 3-D modelers, are perhaps the most important items in
your toolkit because, of all the multimedia elements, the graphical impact of your project will
likely have the greatest influence on the end user.
Painting software, such as Photoshop, Fireworks, and Painter, is dedicated to producing
crafted bitmap images.
Drawing software, such as CorelDraw, FreeHand, Illustrator, Designer, and Canvas, is
dedicated to producing vector-based line art easily printed to paper at high resolution.
Some vector-based packages, such as Macromedia's Flash, are aimed at reducing file download
times on the Web and may contain both bitmaps and drawn art.
Look for these features in a drawing or painting package:
An intuitive graphical user interface with pull-down menus, status bars, palette control, and dialog
boxes for quick, logical selection.
Scalable dimensions, so that you can resize, stretch, and distort both large and small bitmaps
Paint tools to create geometric shapes, from squares to circles and from curves to complex
polygons
The ability to pour a color, pattern, or gradient into any area
The ability to paint with patterns and clip art
Customizable pen and brush shapes and sizes
An eyedropper tool that samples colors
An autotrace tool that turns bitmap shapes into vector-based outlines
Support for scalable text fonts and drop shadow
Multiple undo capabilities, to let you try again, and a history function for redoing effects, drawings,
and text
A property inspector
A screen capture facility
Painting features such as smoothing coarse-edged objects into the background with anti-aliasing;
airbrushing in variable sizes, shapes, densities, and patterns; washing colors in gradients;
blending; and masking
Support for third-party special-effect plug-ins
Object and layering capabilities that allow you to treat separate elements independently
Zooming, for magnified pixel editing
All common color depths: 1-, 4-, 8-, 16-, 24-, and 32-bit color, and grayscale
Good color management and dithering capability among color depths using various color models
such as RGB, HSB, and CMYK
Good palette management when in 8-bit mode
Good file importing and exporting capability for image formats such as PIC, GIF, TGA, TIF,
PNG, WMF, JPG, PCX, EPS, PTN, and BMP.
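As a side note on the color depths in the list above, depth translates directly into uncompressed file size: bits per pixel times the number of pixels. A quick sketch (the function name is ours, chosen for illustration):

```python
# Uncompressed bitmap size at a given color depth: width x height pixels,
# each taking bits_per_pixel bits, converted to bytes.

def image_bytes(width, height, bits_per_pixel):
    """Uncompressed size in bytes for a bitmap at a given color depth."""
    return width * height * bits_per_pixel // 8

# An 800 x 600 image at the common depths:
for depth in (1, 4, 8, 16, 24, 32):
    print(depth, image_bytes(800, 600, depth))
```

At 24-bit color the same 800 x 600 image takes 24 times the storage of a 1-bit black-and-white version, which is why palette management and dithering at lower depths matter.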
Animation, Video, and Digital Movie Tools
Animations and digital video movies are sequences of bitmapped graphic scenes (frames),
rapidly played back.
But animations can also be made within the authoring system by rapidly changing the location
of objects, or sprites, to generate an appearance of motion. Most authoring tools adopt either a
frame- or object-oriented approach to animation, but rarely both.
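The object-oriented (sprite) approach can be sketched as follows. The Sprite class and its field names are illustrative only, not taken from any real authoring tool's API:

```python
# Minimal sketch of object-oriented ("sprite") animation: motion comes not
# from redrawing whole frames but from updating each sprite's position on
# every tick of the animation loop.

class Sprite:
    def __init__(self, x, y, dx, dy):
        self.x, self.y = x, y        # current position
        self.dx, self.dy = dx, dy    # per-frame displacement

    def step(self):
        """Advance one frame; an authoring system would redraw here."""
        self.x += self.dx
        self.y += self.dy

# Move a tiger sprite three "leaps" to the left.
tiger = Sprite(x=100, y=50, dx=-20, dy=0)
for frame in range(3):
    tiger.step()
print((tiger.x, tiger.y))  # (40, 50)
```

A frame-oriented tool would instead store a full image per frame; the sprite model trades that storage for a little per-frame computation.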
To make movies from video, you may need special hardware to convert an analog video signal to
digital data. Macs and PCs with FireWire (IEEE 1394) or USB ports can import digital video
directly from digital camcorders.
Moviemaking tools such as Premiere, Final Cut Pro, VideoShop, and MediaStudio Pro let you edit
and assemble video clips captured from camera, tape, other digitized movie segments, animations,
scanned images, and digitized audio or MIDI files. The completed clip, often with added
transitions and visual effects, can then be played back, either standalone or windowed within
your project.
Video productions
Animations
Games
Interactive web sites
Demo disks and guided tours
Presentations
Kiosk applications
Interactive training
Simulations, prototypes, and technical visualizations

Helpful Ways to Get Started
Consider the following tips for making production work go smoothly:
Use templates that people have already created to set up your production. These can include
appropriate styles for all sorts of data, font sets, color arrangements, and particular page setups that will
save you time.
Use wizards when they are available—they may save much time and pre-setup work.
Use named styles, because taking the time to create your own will really slow you down.
Unless your client specifically requests a particular style, you will save a great deal of
time using something already created, usable, and legal.
Create tables, which you can build with a few keystrokes in many programs and which make
the production look credible.
Help readers find information with tables of contents, running headers and footers, and
indexes.
Improve document appearance with bulleted and numbered lists and symbols.
Allow for a quick-change replacement using the global change feature.
Reduce grammatical errors by using the grammar and spell checker provided with the
software. Do not rely on that feature, though, to set all things right; you still need to
proofread everything.
Making Instant Multimedia
Common desktop tools have become multimedia-powerful.
Some multimedia projects may be so simple that you can cram all the organizing, planning,
rendering, and testing stages into a single effort and make "instant" multimedia. You get many
more ways to convey your message effectively than just a slide show.
Types of Authoring Tools
Each multimedia project you undertake will have its own underlying structure and purpose
and will require different features and functions.
E-learning modules such as those seen on PDAs, MP3 players, and intra-college
networks may include web-based teaching materials, multimedia CD-ROMs or web
sites, discussion boards, collaborative software, wikis, simulations, games, animations,
electronic voting systems, blogs, computer-aided assessment, learning management
software, and e-mail.
This is also referred to as distance learning or blended learning, where online learning
is mixed with face-to-face learning.
The various multimedia authoring tools can be categorized into three groups, based on the method used
for sequencing or organizing multimedia elements and events:
Card- or page-based tools
Icon-based, event-driven multimedia and game-authoring tools
Time-based tools
Choosing an Authoring Tool
Authoring tools are constantly being improved by their makers, who add new features and
increase performance with upgrade development cycles of six months to a year.
It is important that you study the software product reviews in the blogs and computer trade
journals, as well as talk with current users of these systems, before deciding on the best ones for
your needs. Here's what to look for:
Editing Features
The elements of multimedia—images, animations, text, digital audio and MIDI music, and
video clips—need to be created, edited, and converted to standard file formats, using the
specialized applications.
Also, editing tools for these elements, particularly text and still images, are often included in
your authoring system.
The editors that may come with an authoring system will offer only a subset of the substantial
features found in dedicated tools.
Playback Features
As you build your multimedia project, you will be continually assembling elements and testing to see
how the assembly looks and performs. Your authoring system should let you build a segment or part of
your project and then quickly test it as if the user were actually using it. You should spend a great deal
of time going back and forth between building and testing as you refine and smooth the content and
timing of the project. You may even want to release the project to others who you trust to run it ragged
and show you its weak points.
Interface Designer
Like a good film editor, an interface designer's best work is never seen by the viewer; it is
"transparent."
In its simplest form, an interface provides control to the people who use it.
It also provides access to the "media" part of multimedia, meaning the text, graphics, animation,
audio, and video, without calling attention to itself.
The elegant simplicity of a multimedia title screen, the ease with which a user can move about
within a project, effective use of windows, backgrounds, icons, and control panels—these are
the result of an interface designer's work.
Nicole Lazzaro
Nicole Lazzaro is an award-winning interface designer with XEODesign in Oakland, California,
and teaches interface design at San Francisco State University's Multimedia Studies Program.
She spends her days thinking of new ways to design multimedia interfaces that feel more like
real life.
The role of an interface designer is to create a software device that organizes the multimedia
content, lets the user access or modify that content, and presents the content on screen. Three
areas are central to the creation of any interface, and of course they overlap:
Information design
Interactive design
Media design
Writer
Multimedia writers do everything writers of linear media do, and more. They create character,
action, and point of view—a traditional scriptwriter's tools of the trade—and they also create
interactivity.
They write proposals, they script voice-overs and actors' narrations, they write text screens to
deliver messages, and they develop characters designed for an interactive environment.
Writers of text screens are sometimes referred to as content writers.
Domenic Stansberry is a writer/designer who has worked on interactive multimedia dramas
for commercial products. He has also written for documentary film and published two books of
fiction.
Video Specialist
Prior to the 2000s, producing video was extremely expensive, requiring a large crew and
costly equipment.
As shooting, editing, and preparing video have migrated to an all-digital format and become
increasingly affordable to multimedia developers, video elements have become more and more
a part of the multimedia mix.
The result is that video images delivered in a multimedia production have improved from
postage-stamp-sized windows playing at low frame rates to full-screen (or nearly full-screen)
windows playing at 30 frames per second.
For high-quality productions, it may still be necessary for a video specialist to be responsible for
an entire team of videographers, sound technicians, lighting designers, set designers, script
supervisors, gaffers, grips, production assistants, and actors.
Video Specialist wanted for multimedia production. Must have strong background in video
direction, nonlinear editing, and preparing digital video for efficient delivery.
Good understanding of shooting for interactive programming required.
A background working with Ultimatte green screens for compositing live video with
computer-generated backgrounds is a plus.
Audio Specialist
The quality of audio elements can make or break a multimedia project. Audio specialists are the
wizards who make a multimedia program come alive, by designing and producing music,
voice-over narrations, and sound effects.
They perform a variety of functions on the multimedia team and may enlist help from one or
many others, including composers, audio engineers, or recording technicians.
Audio specialists may be responsible for locating and selecting suitable music and talent,
scheduling recording sessions, and digitizing and editing recorded material into computer files
(see Chapter 4).
Multimedia Audio Specialist
Audio specialist needed for multimedia project.
Must have strong background in studio recording techniques— preferably with time spent
in the trenches as an engineer in a commercial studio working on a wide range of projects.
Must be comfortable working with computers and be open and able to learn new technology and
make it work, with high-quality results.
Familiarity with standard recording practices, knowledge of music production, and the ability to
work with artists are a definite plus.
Requires fluency in MIDI; experience with sequencing software, patch librarians, and
synth programming; and knowledge of sampling/samplers, hard disk recording, and editing.
Multimedia Programmer
A multimedia programmer or software engineer integrates all the multimedia elements of a
project into a seamless whole using an authoring system or programming language.
Multimedia programming functions range from coding simple displays of multimedia elements
to controlling peripheral devices and managing complex timing, transitions, and record keeping.
Creative multimedia programmers can coax extra (and sometimes unexpected) performance
from multimedia-authoring and programming systems.
Without programming talent, there can be no multimedia. Code, whether written in JavaScript,
OpenScript, Lingo, RevTalk, PHP, Java, or C++, is the sheet music played by a well-orchestrated
multimedia project.
Hal Wine
Hal Wine is a programmer familiar with both the Windows and Macintosh environments. In his
many years of experience, he has worked in most of the important areas of computing and for
many of the leading computing companies.
Interactive Programmer (HTML, JavaScript, Flash, PHP, and C/C++) needed to work on
multimedia prototyping and authoring tools for DVD and interactive web-based projects.
Thorough knowledge of ActionScript, JavaScript, Flash, HTML5, PHP, and C/C++, Macintosh
and Windows environments required.
Must have working familiarity with digital media, particularly digital video.
Must have a demonstrated track record of delivering quality programming on tight schedules.
Must function well in fast-paced, team-oriented environment.
Knowledge of AJAX methodologies desired.
Producer of Multimedia for the Web
Web site producer is a relatively new occupation, but putting together a coordinated set of pages
for the World Wide Web requires the same creative process, skill sets, and (often) teamwork as
any other kind of multimedia does.
Kevin Edwards
Kevin Edwards is Senior Multimedia Producer for CNET, a publicly traded media company
that integrates television programming with a network of sites on the World Wide Web.
In both types of media, CNET provides information about computers, the Internet, and future
technology using engaging content and design.
CNET has about two million members on the Internet, and its television programming—which
airs on the USA Network, on the Sci-Fi Channel, and in national syndication—reaches an
estimated weekly audience of more than eight million viewers.
Some related areas listed by the bureau include
Unit – IV
The Internet and Multimedia: Internet History, Internetworking, Multimedia on the Web.
Designing for the World Wide Web: Developing for the Web, Text for the Web, Images for the Web,
Sound for the Web, Animation for the Web, Video for the Web.
Delivering: Testing, Preparing for Delivery, Delivering on CD-ROM, DVD and World Wide Web,
Wrapping.
CHAPTER - 1
The Internet and Multimedia
Internet History
The history of the Internet is a complex and fascinating journey that spans several decades. Here's a brief
overview of key milestones:
1. 1960s: The Birth of the ARPANET
The origins of the Internet can be traced back to the U.S. Department of Defense's Advanced
Research Projects Agency (ARPA), which funded the development of the ARPANET
(Advanced Research Projects Agency Network) in the late 1960s.
The first ARPANET link was established in 1969 between the University of California, Los
Angeles (UCLA), and the Stanford Research Institute (SRI).
2. 1970s: Expansion and Email
ARPANET continued to expand, connecting more universities and research institutions.
The first email program was developed by Ray Tomlinson in 1971, allowing users to send
messages between different machines on the ARPANET.
3. 1980s: TCP/IP and the World Wide Web
The development of the Transmission Control Protocol (TCP) and Internet Protocol (IP)
standards in the late 1970s laid the foundation for the modern Internet.
In 1983, ARPANET adopted TCP/IP as its standard, ensuring compatibility between different
computer systems.
The Domain Name System (DNS) was introduced to translate human-readable domain names
into IP addresses.
Tim Berners-Lee invented the World Wide Web in 1989, and the first website went live in
1991. This marked the beginning of a user-friendly, interconnected system of information.
4. 1990s: Commercialization and Popularization
The 1990s saw the commercialization of the Internet, with the emergence of Internet Service
Providers (ISPs) and the introduction of graphical web browsers like Mosaic and Netscape
Navigator.
Online services, such as AOL and CompuServe, gained popularity.
E-commerce began to thrive, and companies like Amazon and eBay were founded.
The "dot-com bubble" occurred in the late 1990s, with many Internet-related companies
experiencing rapid growth and subsequent crashes.
5. 2000s: Broadband, Social Media, and Mobile Internet
The 2000s witnessed the widespread adoption of broadband Internet, providing faster and
more reliable connections.
Social media platforms like Facebook, Twitter, and LinkedIn emerged, transforming the way
people connect and share information.
The rise of smartphones and mobile devices led to increased mobile Internet usage.
6. 2010s: Cloud Computing and Streaming Services
Cloud computing became mainstream, enabling users to store and access data and
applications online.
Streaming services, such as Netflix and Spotify, gained prominence, revolutionizing the way
people consume media.
The Internet of Things (IoT) emerged, connecting various devices to the Internet for data
exchange.
7. 2020s: Continued Innovation and Challenges
The Internet continues to evolve with advancements in technologies like 5G, artificial
intelligence, and blockchain.
Cybersecurity concerns, online privacy issues, and debates over net neutrality are ongoing
challenges.
The COVID-19 pandemic highlighted the importance of the Internet for remote work,
education, and communication.
Internetworking
In its simplest form, a network is a cluster of computers, with one computer acting as a server to
provide network services such as file transfer, e-mail, and document printing to the client computers or users
of that network. Using gateways and routers, a local area network (LAN) can be connected to other LANs
to form a wide area network (WAN). These LANs and WANs can also be connected to the Internet
through a server that provides both the necessary software for the Internet and the physical data connection
(usually a high-bandwidth telephone line, coaxial cable TV line, or wireless). Individual computers not
permanently part of a network (such as a home computer or a laptop) can connect to one of these Internet
servers and, with proper identification and onboard client software, obtain an IP address on the Internet (see
"IP Addresses and Data Packets" later in the chapter).
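The relationship between a client machine and its LAN described above can be illustrated with Python's standard ipaddress module. The addresses below are arbitrary examples from a typical private range, not from the text:

```python
# A LAN can be modeled as a block of IP addresses; a client obtains one
# address inside that block. Python's ipaddress module (standard library)
# handles the membership and sizing arithmetic directly.
import ipaddress

lan = ipaddress.ip_network("192.168.1.0/24")     # a typical private LAN block
client = ipaddress.ip_address("192.168.1.42")    # an address a server might assign

print(client in lan)       # True: this host belongs to the LAN
print(lan.num_addresses)   # 256 addresses in a /24 block
```

Routers and gateways do conceptually the same membership test when deciding whether a packet stays on the local network or is forwarded toward the wider Internet.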
Internet Addresses
Let's say you get into a taxi at the train station in Trento, Italy, explain in English or Spanish or
German or French that you wish to go to the Mozzi Hotel, and half an hour later you are let out of the car in
a suburban wood: you have an address problem. You will quickly discover, as you return to the city in the
back of a bricklayer's lorry to report your missing luggage and the cab driver, Mauro, who sped away in the
rain, that you also have a serious language problem. If you know how addresses work and understand the
syntax or language of the Internet, you will likely not get lost and will save much time and expense during
your adventures. You will also be able to employ shortcuts and workarounds.
Top-Level Domains
When the original ARPANET protocols for communicating among computers were remade into the
current scheme of TCP/IP (Transmission Control Protocol/Internet Protocol) in 1983, the Domain Name
System (DNS) was developed to rationally assign names and addresses to computers linked to the Internet.
Top-level domains (TLDs) were established as categories to accommodate all users of the Internet: .com
(commercial), .edu (education), .gov (government), .mil (military), .net (network providers), .org
(organizations), and two-letter country codes such as .uk and .it.
In late 1998, the Internet Corporation for Assigned Names and Numbers (ICANN) was set up to
oversee the technical coordination of the Domain Name System, which allows Internet addresses to be found
by easy-to-remember names instead of one of 4.3 billion individual IP numbers. In late 2000, ICANN
approved seven additional TLDs: .aero, .biz, .coop, .info, .museum, .name, and .pro.
Second-Level Domains
Many second-level domains contain huge numbers of computers and user accounts representing
local, regional, and even international branches as well as various internal business and management
functions. So the Internet addressing scheme provides for subdomains that can contain even more
subdomains. Like a finely carved Russian matryoshka doll, individual workstations live at the epicenter of a
cluster of domains. Within the education (.edu) domain containing hundreds of universities and colleges, for
example, is a second-level domain for Yale University called yale. At that university are many schools and
departments (medicine, engineering, law, business, computer science, and so on), and each of these entities
in turn has departments and possibly subdepartments and many users. These departments operate one or
even several servers for managing traffic to and from the many computers in their group and to the outside
world. At Yale, the server for the Computing and Information Systems Department is named cis. It manages
about 11,000 departmental accounts—so many accounts that a cluster of three subsidiary servers was
installed to deal efficiently with the demand. These subsidiary servers are named minerva, morpheus, and
mercury. Thus, minerva lives in the cis domain, which lives in the yale domain, which lives in the edu
domain. Real people's computers are networked to minerva. Other real people are connected to the
morpheus and mercury servers.
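The nesting just described (minerva inside cis, inside yale, inside edu) can be read directly off a fully qualified hostname by splitting on the dots. This helper is a sketch for illustration, not part of any DNS library:

```python
# Each suffix of a dotted hostname names one enclosing domain, from the
# individual server out to the top-level domain.

def domain_chain(hostname):
    """Return the chain of enclosing domains, innermost first."""
    labels = hostname.split(".")
    return [".".join(labels[i:]) for i in range(len(labels))]

for domain in domain_chain("minerva.cis.yale.edu"):
    print(domain)
# minerva.cis.yale.edu
# cis.yale.edu
# yale.edu
# edu
```

DNS resolvers work through this same chain in reverse, asking the .edu servers where yale is, the yale servers where cis is, and so on down to the host.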
Internet Services
To many users, the Internet means the World Wide Web. But the Web is only the latest and most popular of
the services available today on the Internet. E-mail; file transfer; discussion groups and newsgroups; real-time
chatting by text, voice, and video; and the ability to log into remote computers are common as well. Internet
services are shown here.
Each Internet service is implemented on an Internet server by dedicated software known as a
daemon. (Actually, daemons only exist on Unix/Linux systems—on other systems, such as Windows, the
services may run as regular applications or background processes.) Daemons are agent programs that run in
the background, waiting to act on requests from the outside. In the case of the Internet, daemons support
protocols such as the Hypertext Transfer Protocol (HTTP) for the World Wide Web, the Post Office
Protocol (POP) for e-mail, or the File Transfer Protocol (FTP) for exchanging files. You have probably
noticed that the first few letters of a Uniform Resource Locator (URL)—for example,
http://www.timestream.com/index.html—notify a server as to which daemon to bring into play to satisfy a request. In
many cases, the daemons for the Web, mail, news, and FTP may run on completely different servers, each
isolated by a security firewall from other servers on a network.
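The way a URL's first letters select a protocol, and thus a daemon, can be seen with Python's standard urllib.parse, which splits a URL into exactly the parts described above:

```python
# A URL encodes the protocol (scheme), the server to contact (netloc), and
# the resource on that server (path). The scheme tells the server which
# daemon (HTTP, FTP, ...) should handle the request.
from urllib.parse import urlparse

url = urlparse("http://www.timestream.com/index.html")
print(url.scheme)  # 'http' -> handled by the HTTP daemon
print(url.netloc)  # 'www.timestream.com'
print(url.path)    # '/index.html'
```

Swapping the scheme to ftp:// in the same URL would direct the request to an FTP daemon instead, even on the same machine.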