Video Editing (Notes)


What is H.264?

H.264 is a video codec standard that achieves high-quality video at relatively low bitrates. You can think of it as the "successor" to the existing formats (MPEG-2, MPEG-4, DivX, XviD, etc.), as it aims to offer similar video quality at roughly half the size of the formats mentioned before.
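As a back-of-the-envelope illustration of that "half the size" claim, file size follows directly from average bitrate and duration. The function and the bitrate figures below are illustrative assumptions, not measured values:

```python
def file_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate file size in megabytes for a stream at a given average bitrate."""
    # kilobits/s -> bits, then /8 for bytes, /1_000_000 for megabytes
    return bitrate_kbps * 1000 * duration_s / 8 / 1_000_000

# A 10-minute clip: MPEG-2 at ~8000 kbps vs. H.264 at ~4000 kbps for similar quality
print(file_size_mb(8000, 600))  # 600.0 MB
print(file_size_mb(4000, 600))  # 300.0 MB
```

Halving the bitrate at comparable quality halves the file size, which is exactly the trade H.264 offers over the older codecs.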

Apple ProRes 422

Apple ProRes is a video compression format created by Apple: an intermediate codec used in Final Cut Pro and other professional editing software such as Premiere Pro and Avid Media Composer for editing rather than for delivery playback. As a high-quality, high-performance editing codec that takes advantage of multicore processors, Apple ProRes offers multistream, real-time editing performance and impressive output image quality at low complexity. It supports frame sizes ranging from SD and HD up to 2K and 4K at full resolution.

Apple ProRes offers optimal editing flexibility because all ProRes codecs are frame-independent, with the help of variable bitrate (VBR) technology. Apple ProRes is a 10-bit codec, so it can carry more color data. If your source video was shot with an 8-bit codec, you can convert the files to Apple ProRes to take advantage of the 10-bit encoding and make color correction trouble-free.
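The jump from 8-bit to 10-bit color can be made concrete with simple arithmetic (a minimal sketch; the helper name is ours, not Apple's):

```python
def levels_per_channel(bits: int) -> int:
    """Number of distinct code values available per color channel at a given bit depth."""
    return 2 ** bits

print(levels_per_channel(8))   # 256 levels per channel (8-bit source)
print(levels_per_channel(10))  # 1024 levels per channel (10-bit ProRes)
```

Four times as many code values per channel leaves more room for aggressive color grading before banding appears.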

What is compositing in video editing?

Compositing is the combining of visual elements from separate sources into single
images, often to create the illusion that all those elements are parts of the same
scene. Live-action shooting for compositing is variously called "chroma key", "blue
screen", "green screen" and other names.

What is compositing in VFX?

A Compositor, Compositing Artist or 'Comper' (for short) works at the end of the VFX process to combine CGI and digital matte paintings with live-action plate photography. The art of VFX compositing is to make all the disparate elements, whether created digitally or photographed practically, appear to be parts of the same scene.
Brief description about Steenbeck

Steenbeck is a company that manufactures flatbed editors. The Steenbeck brand name has become synonymous with a type of flatbed film editing suite usable with both 16 mm and 35 mm film, with optical or magnetic sound. The Steenbeck company was founded in 1931 by Wilhelm Steenbeck in Hamburg, Germany.

Analog vs. Digital

Analog and digital signals are used to transmit information, usually through electric
signals. In both these technologies, the information, such as any audio or video, is
transformed into electric signals. The difference between analog and digital technologies
is that in analog technology, information is translated into electric pulses of varying
amplitude. In digital technology, information is translated into binary format (zero or one), where each bit represents one of two distinct amplitudes.

Comparison chart

Analog versus Digital

Signal: An analog signal is a continuous signal which represents physical measurements; digital signals are discrete-time signals generated by digital modulation.

Waves: Analog signals are denoted by sine waves; digital signals by square waves.

Representation: Analog uses a continuous range of values to represent information; digital uses discrete or discontinuous values.

Example: Analog – the human voice in air, analog electronic devices. Digital – computers, CDs, DVDs, and other digital electronic devices.

Technology: Analog technology records waveforms as they are; digital technology samples analog waveforms into a limited set of numbers and records them.

Data transmissions: Analog is subject to deterioration by noise during transmission and the write/read cycle; digital can be noise-immune, with no deterioration during transmission and the write/read cycle.

Response to noise: Analog is more likely to be affected, reducing accuracy; digital is less affected, since noise responses are themselves analog in nature.

Flexibility: Analog hardware is not flexible; digital hardware is flexible in implementation.

Uses: Analog can be used in analog devices only and is best suited for audio and video transmission; digital is best suited for computing and digital electronics.

Applications: Analog – thermometer. Digital – PCs, PDAs.

Bandwidth: Analog signal processing can be done in real time and consumes less bandwidth; there is no guarantee that digital signal processing can be done in real time, and it consumes more bandwidth to carry the same information.

Memory: Analog data is stored in the form of a wave signal; digital data is stored in the form of binary bits.

Power: An analog instrument draws large power; a digital instrument draws only negligible power.

Cost: Analog is low cost and portable; digital is high cost and not easily portable.

Impedance: Analog – low. Digital – high, on the order of 100 megaohms.

Errors: Analog instruments usually have a scale which is cramped at the lower end and give considerable observational errors; digital instruments are free from observational errors like parallax and approximation errors.
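The sampling step described under "Technology" can be sketched in a few lines. This is a simplified illustration of quantization, not a production ADC model:

```python
import math

def quantize(sample: float, bits: int) -> int:
    """Map an analog sample in [-1.0, 1.0] onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    # Shift to [0, 1], scale to the top level, and round to the nearest integer
    return round((sample + 1.0) / 2.0 * (levels - 1))

# Sample one cycle of a sine wave at 8 samples per cycle and quantize to 3 bits
digital = [quantize(math.sin(2 * math.pi * t / 8), 3) for t in range(8)]
print(digital)
```

The continuous wave becomes a short list of integers: that list is what gets stored, transmitted, and reconstructed on the digital side.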

Brief description of Apple Motion software

Motion is a behavior-driven motion graphics application used to create stunning imaging effects in real time for a wide variety of broadcast, video, and film projects.

In Motion, you can:

§ Create sophisticated animations on the fly using any of more than 200 built-
in motion and simulation behaviors, such as Spin, Throw, or Orbit, which
allow you to add dynamic motion to your projects in real time, with no
preview rendering time necessary.
§ Build complex visual effects using one or more of nearly 300 filters such as
Glow, Strobe, or Bleach Bypass.

§ Animate the traditional way, using keyframes and modifiable curves, to create precise timing effects.

§ Create polished text effects, from the simple (lower-thirds and credit rolls)
to the complex (3D titles, animated effects, and sequencing text).

§ Create custom effect, transition, title, and generator templates for use in
Final Cut Pro X. You can also modify the effects, transitions, titles, and
generators that ship with Final Cut Pro.

§ Import and reorient 360° video, then apply effects and integrate titles or
other graphics to create seamless 360° Motion projects or 360° templates for
Final Cut Pro.

§ Use rigging to map multiple parameters to a single control (for example, a slider that simultaneously manipulates size, color, and rotation of text) in Motion compositions or in templates exported to Final Cut Pro X.

§ Build compositions by selecting from royalty-free content, such as vector artwork, animated design elements, and high-resolution images.

§ Retime footage to create high-quality slow-motion or fast-motion effects.

§ Stabilize camera shake or create complex motion-tracking effects such as match moves and corner-pinning.
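Keyframe animation, mentioned above, boils down to interpolating a parameter between user-set (time, value) pairs. The sketch below is a generic linear interpolator, not Motion's actual implementation:

```python
def interpolate(keyframes, t):
    """Return the parameter value at time t, linearly interpolated between keyframes.

    keyframes: list of (time, value) pairs; a hypothetical helper, not Motion's API.
    """
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            fraction = (t - t0) / (t1 - t0)
            return v0 + fraction * (v1 - v0)

# Opacity rises from 0 at frame 0 to 100 at frame 10, then holds until frame 20
keys = [(0, 0.0), (10, 100.0), (20, 100.0)]
print(interpolate(keys, 5))   # 50.0
print(interpolate(keys, 15))  # 100.0
```

Motion's "modifiable curves" replace the straight-line fraction here with an editable easing curve, but the principle is the same.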

Brief description of Soundtrack Pro

Apple's loop-sequencing application has grown up, with the addition of sophisticated recording, editing and mixing facilities, a powerful waveform editor, and many of Logic's most sought-after effects.

The marketing copy for Apple's original Soundtrack claimed that it enabled anyone working in
picture to build professional-quality audio soundtracks by combining the thousands of ready-
made, royalty-free Apple Loops that are included with the program. Apple describe the program
as 'A royalty-free orchestra at your command', but what it really offers is a royalty-free orchestra
plus a royalty-free composer, both of whom will work for no wages, and I'm sure I'm not alone in
feeling a little uncomfortable with this 'even the family pet can play it!' approach. To be fair, you
can also record and import 'real' audio in the form of AIFF or WAV files as well as bringing
in Acid loops, but what it seems to be saying to the musician busy working on TV projects is that
we can cut out the middle man — and the middle man is you!

The new Soundtrack Pro, however, is an entirely different thermal inducing device full of aquatic
lifeforms of the subphylum Vertebrata. Indeed, it is so much more sophisticated than the
original Soundtrack that it has little in common other than the name. Yes, you can still do intuitive
loop-based music compilation, but now you can also edit stereo and multitrack audio to single-
sample accuracy, mix, use third-party plug-in AU effects and restore audio corrupted with clicks,
hiss or hum. You can bring in movies and sync them to your music, while the time-bending tools built into the program let you change the tempo of the music to fit the video or sync to Final Cut Pro 5's Scoring Markers.

As you'd expect, Soundtrack Pro is specifically designed to work alongside Final Cut Pro and
other Apple software to facilitate the transferring of video and audio files between applications,
and the multitrack capabilities of the program are ideally suited to layering up and mixing music,
dialogue and effects for use in Final Cut Pro. If you think of the program as combining elements of
programs like Logic, BIAS Peak LE and Garage Band in a way that is familiar to Final Cut users,
you won't be far off the mark, though it also includes some very advanced workflow tools that you
won't find in many other audio programs. You can even use Soundtrack Pro's waveform editor section as a stand-alone waveform editor for use with applications like Logic, Live or Reason. However, there's no MIDI or software instrument part to the program; Soundtrack Pro is more about editing audio and syncing it to picture.

Soundtrack Pro is available as a stand-alone application but also comes as part of the Final Cut Studio suite, along with Final Cut Pro 5, DVD Studio Pro and Motion 2.

Brief History of editing

In the very early days of film, movies were made as one continuous shot. Editing came along quite quickly after that, so that by 1916 D.W. Griffith was making sophisticated films that made use of the idea of changing shot sizes and camera positions.
There is a balance, however, between cutting – the juxtaposition of shot against shot – and mise en scène. Mise en scène is the placement of action in a scene in relation to the camera, and also of camera movement itself. This, combined with editing, creates the form of the film. A disadvantage of allowing the action to unfold in
front of the camera without cutting is that it can be very slow, things have to happen
in real time, whereas with editing, you can speed up irrelevant detail (e.g. unlocking
the car door prior to a car chase). Also, a cut changes your focus of attention much
more crisply.
The dominant style of editing in narrative films is known as continuity editing – nearly
all Hollywood films conform to this way of putting films together. They are cut so
that shot follows shot in a “believable” sense, and it feels as though events were
occurring seamlessly as the story unfolds. There are many rules and conventions to
follow, but if you watch any TV drama or film on TV and start to look at how it is
constructed, you will get a sense of this. Generally a wide shot is followed by a closer,
mid shot, followed by a close-up. Wide shots tend to stay on screen for longer than
tighter shots. Visually a sequence will contain shots of broadly similar tonal range and
you will always understand where you are spatially in relation to the characters on
screen. A shot change is nearly always accompanied by a 30 degree or more change in
camera angle, but it will stay on the same side of the action.
Continuity editing is not, however, the only style of editing. Ever since editing came
about there have been people who have developed other styles of editing, not
according to the rules of helping a story to progress invisibly, but in terms of using
cuts in either a rhythmic way, or using the graphical content of the images or
contrasting time or location between cuts.
The length of a shot can vary from 1/25th of a second to the entire length of the film.
How rapidly you cut from one shot to another can depend on the action, or it can
depend on the rhythm you want to create. You can decide to make a cut every
second, or 3 seconds, and set up an arbitrary rhythm, thereby determining the pace
of your film.
Graphical cutting means that you are using the shot as an image – a shape, or series of patterns – and this determines your criteria for cutting, e.g. the dance sequences in Busby Berkeley musicals of the 1930s, where women dressed up as piano keys, or high-angle shots of their legs were used to make kaleidoscope patterns.
In a conventional narrative scene, you stay in one time and place. You can however,
cut between one location and another, and this, known as cross-cutting or parallel
cutting, can create a powerful sense of the two places being interconnected in some
way. It allows you to join quite separate incidents.
Filmmaking can be described as many things – certainly story telling, but also painting
with time and light. The temporal aspect of editing allows us to control the time of
the editing – it allows us to tell a story spanning 300 years in 1.5 hours, by selecting
the necessary pieces of the story. It also controls the order of events you show.
Some of the most famous proponents of this non-continuity style of editing were a
group of Russians, just after the Russian Revolution of 1917. The social upheaval, and
sense of creating a new world order had a strong effect on all the creative arts of the
time. The Russian film makers worked on the idea of montage – that is cutting
together seemingly disjointed images, to create a deeper level of meaning. They saw
editing as “writing” a film with shots. The editing was the organisation of the film – it
did not merely progress the story. The most famous of all these is Sergei Eisenstein.
He wanted to involve his audience in the construction of the meaning of the film, so he
juxtaposed seemingly unrelated images such as shots of starving people with soldiers
and bombs. He wanted to create a collision of meanings, a conflict and synthesis of
opposites. He called this dynamic editing. Of course we are all so used to this now,
that it seems unremarkable, but he was trying to get the audience to make a
connection between the two shots, and to respond to the film on an emotional,
intellectual and perceptual level. He called this “intellectual montage”.
As well as juxtaposing shots of different places and events, he also did not work to
“real time”. In some of his films he would depict the same event time after time. Again,
we are used to seeing that sort of thing now, but traditionally, mainstream narrative
films will show you an event only once, then get on with the story. Eisenstein wanted
to show the event from many different points of view, so a sequence that would take
only a few seconds to occur in life could take several minutes in one of his films.
Perhaps the most famous example of this is the Odessa steps sequence in his film
‘Potemkin’ made in 1925.
