Making 360 Manual
preface
"THE WELL.
the town may be changed,
but the well cannot be changed.
it neither decreases nor increases
they come and go and draw from the well
structures change, but the life of man with its needs remains
eternally the samethis cannot be changed. life is also inexhaustible.
it grows neither less nor more; it exists for one and for all...
the foundations of human nature are the same in everyone."
we know technology will evolve exponentially: software will update, and
hardware will become smaller, lighter, and more efficient. the chapters we
have written thus far could be irrelevant as of yesterday. so please
continually self-develop with us.
we want this book to be a town well, a reservoir of resources we can all
draw from!
introduction
by Jason Fletcher
Let no one say otherwise, shooting 360 video is difficult and intense! It's
a medium with completely unique challenges. And that is exciting
for both the tech folk and the storytellers. But you'll need to understand
the many hurdles so that you can soar. Knowing the specific details, inherent
limitations, and potential problems will only help to inform how to
successfully create immersion. And that is what this book aims to do!
We are going to throw a bunch of information at you. Yet it's really up to you
to connect the dots and understand the optimal workflow for your specific
camera rig. This isn't your typical DIY book. Really, to become adept at 360
video, you will need to perform test shoots and run into problems yourself.
The best way to learn and gain valuable experience is to fail! With that said,
we will equip you with a comprehensive approach.
problems of shooting

equip
The Elements
Platonic Rigs
Play Your Cards Right

setup
Formatting Cards
World Clock
Set and Settings
Get Juiced
Lifetime Supply
Pair to Remote
Multiple Rigs
Realtime Preview

plan
Safety Zones
Sandbag
Shoot the Moon
Hide and Seek!

shoot
Reference Signals
Hit and Run!
Flashing Lights
Stabilization
Lighting
Misfires
Ring Around the Rosey
Beep Code
Frozen
On Fire
Modified Fisheye

problems of stitching

import
Dymaxion Chronofile
Ingestion

stitch
Dailies Quickstitch
Color Matching
Synchronization
Background vs Foreground
Optimization Settings
Control Points
Parallax between Cameras
One Seam Leads to Another
Surreal Bodies
Template Stitch
Masking Markers
Stereoscopic 3D 360
Circular Crop Factor
First Person
Moving Shots
Patching Nadir
Stitch Anything
Stitch to Desh
Noisy Footage

edit
First Assembly
Mise en Place
Rotoscoping
AE Comping
Chroma Keying
Color Grading
A/B Testing
Hello FFmpeg
Almost Done!

render
problems of shooting

equip
The Elements
Welcome, O life! I go to encounter for the millionth time the reality of
experience and to forge in the smithy of my soul the uncreated conscience
of my race.
- James Joyce, A Portrait of the Artist as a Young Man
Problem:
You want to be the next first greatest VR storyteller
of all time and space.
You want to create audiovisual immersive experiences. You want to expand
cinema, compassion, and consciousness. You want to explore change.
You want to create new tools for self-awareness. You want to help write a
new cinematic language. You want to break open that window of limited
views and climb out right into pure experience. You want to bring the world
one step closer to putting ourselves in each other's shoes. Hello, astronaut!
That's great, but what are the first steps to take you closer? What materials
do you need to shoot, learn, and grow right here, right now, today?
Solution:
Dive deep in. No fear. Take the first step. Then the
one after that. Gather all the elements and start
experimenting!
Here's a basic checklist for your journey:
have fun!
When we say expanded cinema we actually mean expanded consciousness.
Expanded cinema does not mean computer films, video phosphors, atomic
light, or spherical projections. Expanded cinema isn't a movie at all: like life
it's a process of becoming, man's ongoing historical drive to manifest his
consciousness outside of his mind, in front of his eyes.
- Gene Youngblood, Expanded Cinema
Platonic Rigs
There is geometry in the humming of the strings, there is music in the
spacing of the spheres.
- Pythagoras
Problem:
You need to choose a 360 camera rig from all the
options and configurations available.
The popular 6 camera cube? 7 camera cylindrical layout? 10 camera layout?
Or perhaps 3 cameras with modified fisheye lenses? Mono or stereo? What
about spherical or cylindrical? One size does not fit all. Don't worry, we'll
find the perfect fit. Selecting a rig depends on the type of content you are
shooting, the environment, distance, moving shots and, of course, money in
the piggy bank.
Solution:
Prioritize your needs.
MONO vs STEREO
SPHERICAL vs CYLINDRICAL
If you have decided to stay mono, there is quite a range of options for you
to choose from that offer high resolution. Again, pick the rig based on the
style and type of content you are shooting. If you are shooting landscape
with minimal subjects, then a cylindrical rig with more cameras around will
offer extra high resolution. There will be more camera coverage around
the horizon. However, because of the limited vertical FOV (field of view),
there will be a hole at the nadir (floor) or zenith (sky). In other words, there
will be a zone where footage is not captured, but this may be okay because
the viewer will not be looking at the sky or floor most of the time. So if
you are shooting for a dome, the nadir hole won't be a problem since the
camera rig will be on a tripod and won't be rendered into the fisheye shot.
The sky and floor can also be shot with an extra camera. You can even
use a still camera, such as a Nikon or Canon. Then during the stitching
process, fix the missing zone and patch in the nadir hole or replace the
tripod.
But a cylindrical rig is not ideal if you have multiple subjects moving around
between cameras. More money, more problems. More cameras, more
seams!
A spherical hemicube rig is an option if you have a smaller budget and
fewer cameras on hand. There will be equal coverage between the cameras,
including the zenith and nadir.
FISHEYE vs WIDE-ANGLE
Another option is to modify the cameras with fisheye lenses. You
can achieve a greater FOV than with a wide-angle lens and have more
coverage per camera. In effect you will need fewer cameras for the rig,
which allows the cameras to be closer together and have less parallax. An
advantage of this rig is allowing subjects to get up close to the camera,
because there are fewer cameras and seam lines to break. It also allows
for more footage overlap, which can really help to hide any seams during
the stitching process.
RECOMMENDED MODELS
MONO
6 hemicube camera rig
10 camera rig
3 camera modified fisheye rig
4 camera modified fisheye rig
STEREO
12 camera rig
14 camera rig
6 camera modified fisheye rig
8 camera modified fisheye rig
Play Your Cards Right

Solution:
Sell out and go with the name-brand, GoPro-endorsed and recommended cards.
Use the same make of memory cards for all the cameras. You want all the
cameras as identical as possible, so the microSD cards matter as well. Get
the cards with the fastest read/write speeds. The cards with the fastest
write speed will perform better in the cameras, and having the fastest read
speed will minimize file transfer time. Spend more money on the higher
class cards, since they will last longer as well.
The SanDisk Extreme PLUS 64 GB or Lexar 633x 64 GB is recommended.
You need the cards with the fastest read/write speeds because when you
shoot with a high resolution video mode on the GoPro cameras, you
are obviously dumping a lot of data onto the memory card! So if you buy a
knockoff memory card and the write speeds aren't up to snuff, then the
camera buffer will fill up and it will stop recording prematurely.
setup
Formatting Cards
N.Z: I suppose your explorations of new media are like swimming in an
endless ocean.
N.J. Paik: A tabula rasa, you know, a white paper. Video is a white paper, a
tabula rasa.
Problem:
How do you keep track of all the cameras and tiny
microSD cards?
Be organized! Number your cards as well as your cameras. Color code your
cameras if you have multiple rigs. This will prevent headaches and
confusion during Ingestion and post production. All the normal problems
of shooting are multiplied by X cameras, so proceed with extra care.
Solution:
Blank canvas and tweezers.
Before every shoot, format all your cards or file management will get
really messy. Keep the same microSD card in each camera so it is easier to
troubleshoot. For example, if one card has corrupted files, out-of-focus
footage, overexposure, or other problems, you can track it down to
the exact camera. Of course, always double check that your footage has
been backed up before formatting.
Formatting the cards through the camera is best, instead of on the computer,
so the original file structure and partitions are restored.
PROTIP: For those of us without tiny hands, tweezers are very useful for
getting the microSD cards in and out of the cameras, especially when they
are in the rigs and less accessible.
World Clock
Problem:
You need a file naming convention.
Camera 1 files start at GOPR0001.mp4, Camera 2 at GOPR1234.mp4, Camera 3 at GOPR4747.mp4, and so on.
Solution:
Use Time/Date for naming convention.
Synchronize the clocks on all the cameras. This will make file management
and comparing takes much easier later on. Inside each take folder you
can check the details section and confirm that all the videos in that take
start at the same time.
To set the Time/Date, use the menus on the GoPro, or connect the camera
to the GoPro app or software and set the clocks manually.
Here the clocks were not set. To confirm the videos are all from the same
take, you can check the file size.
Here the clocks were set, making it much easier for the DIT or stitcher to
organize the files.
For the month/day, you can use the day as the camera number and the
month as the rig number if there are multiple rigs.
January 01 - camera 01, rig 1
January 02 - camera 02, rig 1
February 04 - camera 04, rig 2
The Time/Date can be used as a form of metadata for reference during
the Ingestion and stitching process later on.
PROTIP: Every time you update the firmware on the GoPros, or if you leave
the battery out for an extended time, the Time/Date will reset. Make sure to
go back and synchronize the clock on all the cameras. Although you will
be able to tell if the files are part of the same take from other details, like
file size, it is easier to pull files across all cameras into a take folder from
the same start time.
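If the clocks are synchronized, the sorting can even be scripted. Here is a
minimal sketch in Python, assuming one folder per camera of GoPro .MP4
files and using file modification time as a stand-in for the recording clock;
the folder names and tolerance are hypothetical, so adapt them to your
own layout.

# group_takes.py - sort per-camera GoPro files into take folders by
# clustering their timestamps. Clips recorded within TOLERANCE seconds
# of each other are treated as the same take.
import shutil
from pathlib import Path

SOURCE = Path("Source/Video")   # hypothetical ingest location
TOLERANCE = 5                   # seconds between takes

clips = []
for cam in sorted(SOURCE.glob("Camera*")):
    for f in cam.glob("*.MP4"):
        clips.append((f.stat().st_mtime, cam.name, f))

clips.sort()                    # oldest clip first
take, last_time = 0, None
for t, cam, f in clips:
    if last_time is None or t - last_time > TOLERANCE:
        take += 1               # a new cluster of timestamps = a new take
    last_time = t
    dest = SOURCE / f"Take{take:02d}"
    dest.mkdir(exist_ok=True)
    shutil.copy2(f, dest / f"{cam}_{f.name}")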
Set and Settings
Problem:
You have to set all the settings on the cameras.
You will have to set each camera manually by hand, so decide on the
default settings you want to shoot with before changing them. Every
camera must have all the same settings, especially frame rate!
Solution:
Keep it RAW. Match all the cameras. Find the sweet
spot between resolution and frame rate.
You want your settings matched identically across all the cameras. This
will allow them to stitch better, with less Color Matching and balancing
to correct in post. Start by deciding the frame rate and aspect ratio. This
depends on the rig you selected. Certain rigs REQUIRE a 4:3 aspect ratio.
Protune - on
The Protune setting should always be kept on. Protune will give you much
higher dynamic range and overall image quality, with more detail in
highlights and shadows. The image will shoot flatter for more freedom in
color correction. Protune has a higher data rate capture (up to 60 megabits
per second) and less compression, giving you more information to work
with (and fewer compression artifacts). Having a neutral color profile across
all the cameras will give you more latitude and make it easier to color
balance and correct for a nice stitch.
PROTIP: Turn Protune ON first, before selecting other settings, because
the settings for resolution and FPS reset when Protune is changed.
This keeps the color flat, but you keep more information for color correcting
and Color Grading during post production.
Resolution/FPS
Next decide your aspect ratio. Depending on which rig you are using,
certain settings must be used for there to be enough overlap between the
seams.
For a hemicube 6 camera rig like the Freedom360, 360Heros Pro6, or
360Abyss, the aspect ratio has to be 4:3 so there is enough overlap in the
seams.
The most recent GoPro HERO4s now offer:
2704x2028 at 30 FPS
1440x1920 at 80 FPS
For cylindrical rigs, the aspect ratio can be 16:9 because each camera will
be closer to the adjacent left/right camera. The 16:9 aspect ratio will offer
enough overlap. Then you can use the 2.7K settings and have a higher
resolution output stitch like 8K.
2704x1520 at 60 FPS
ISO Limit - 400
This adjusts the camera's sensitivity in low light conditions. Keep it at 400,
which will give you darker videos but the least noise and gain.
Auto Low Light - off
The camera will automatically adjust to changes in exposure when shooting
in low light environments. Again, we want any setting where the cameras
automatically change themselves turned off, so the cameras stay as close
to each other's settings as possible.
Sharpness - low
The videos will need to be sharpened during post production for more
clarity and details in the headset. Use the low setting for less processing
on the footage and more data in post.
Exposure Compensation - 0.0
The range is -2 to +2, in 0.5 step increments. Leave the exposure at 0.0 and
equal on all the cameras. If you have one or two cameras pointing at the
sky, you can bump just those cameras up to +1.0 or +2.0. If you have a
Realtime Preview or field monitor with you, try out different increments
and adjust the settings accordingly.
PROTIP: When using a new rig for a shoot, test the cameras and adjust the
settings the day before! Unload the footage and do a test stitch to double
check and make sure the settings are correct and best for that rig. If you
are torn between higher FPS or resolution, do a test and check it out in the
headset beforehand. After you find the sweet spot, write down the settings
and charge up the batteries for the shoot. Check again on the day of the
shoot to make sure the settings did not accidentally get knocked in transit.
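Since "write down the settings" is easy to forget, it can also help to keep
the chosen defaults in a small machine-readable note that travels with the
project, so anyone can re-check a camera against the same reference. A
minimal sketch in Python; the values shown are example choices for a
hemicube rig, not prescriptions:

# rig_settings.py - log the rig's chosen defaults to a JSON file so every
# camera can be verified against the same reference before a shoot.
import json

settings = {
    "rig": "hemicube-6",          # example rig name
    "resolution": "2704x2028",    # 4:3 on the HERO4
    "fps": 30,
    "protune": "on",
    "iso_limit": 400,
    "sharpness": "low",
    "ev_comp": 0.0,
}

with open("rig_settings.json", "w") as f:
    json.dump(settings, f, indent=2)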
Get Juiced
Problem:
You charged the batteries last night but there's only
one bar left all of a sudden!
Prepare extra provisions. GoPro batteries can drain very quickly, so be sure
to pack additional spare batteries. Over time they will also hold
less charge.
Solution:
Charge and recharge. Cycle and recycle.
You can charge the batteries inside the camera through USB or with external
battery chargers. It's nice to have multiple dual battery chargers so you can
speed up the process. Also, keeping a system in place to distinguish
drained from full batteries is helpful on set. Put gaffer's tape on the ones
with a full charge to tell them apart from the ones that need to be recharged.
To lengthen the lifespan of your rechargeable batteries, use the batteries
until they drain instead of constantly recharging them. Then they will be
able to charge a full cycle and also last longer with each turn.
PROTIP: Even when a GoPro camera is OFF, the WiFi can be ready and
waiting. This is signified by the blinking blue LED light. If the WiFi is
left ON overnight, the batteries will lose their charge within 24 hours.
Lifetime Supply
Problem:
You want to shoot a glorious long take or a beautiful time lapse, but the
batteries might die halfway through.
With WiFi on, GoPro battery life lasts around 1:05-1:30 hours on the HERO4
Black, depending on which settings you are on. If you are using the WiFi
remote, then the battery will last 0:55-1:40 hours. Shooting a long take is
not recommended for the faint of heart! The cameras may dip out from
battery drain, overheating, or firmware issues. If you are up for the thrills,
plan to at least be connected to a power source.
Solution:
Connect to an external battery source through a
mini USB cable.
If you are in an indoor situation then you can plug into the wall. If you are
trekking through the wild west then bring an external power pack and lots
of water. We recommend Anton Bauer battery packs.
PROTIP: The mini USB connector is extremely delicate, so using a right-angled
cable might be gentler on the port. Right-angled cables do have a
tendency to break during travel, but they are more convenient to replace
than soldering the pieces of the camera connector back together.
Pair to Remote
Problem:
The camera rig is out of reach and you can't manually hit record.
In some situations, you won't be able to manually trigger the cameras. For
example, if the cameras are rigged high up, on a dolly, or on a drone. You
can use a WiFi or Smart remote to easily turn all the cameras in a rig on
and off!
Solution:
Use a WiFi or Smart remote.
The remotes can trigger up to 50 GoPros at once. Before the shoot, pair a
remote to the rig. The WiFi will have to be turned on for each camera. This
will drain the batteries faster, so if you are in a situation where you can't
use a Lifetime Supply or constantly charge batteries and backups, then
make sure to save power by turning the WiFi off between takes.
To pair a remote to your GoPro HERO4, turn the camera on and enter the
wireless menu. Select REM CTRL and select NEW pairing. The camera
will be in pairing mode for 3 minutes.
Next, turn the remote on and put it into pairing mode. If you have an older
WiFi remote, hold down the red shutter button and press the white power
button to turn the remote on and enter pairing mode. If you have a Smart
remote, turn it on with the power and mode button. Once it shows a WiFi
symbol, hold down the Settings/Tag button to enter pairing mode.
Both the camera and remote should now have two arrows pointing towards
each other in the display. The remote will ask you if you want to pair another
camera. Repeat until you have all your cameras connected.
You will probably want to use this method even if the camera is in reach,
so the cameras roll at approximately the same time. The cameras will
still be a few frames off from each other. If you are on set or have backup
batteries, then leaving WiFi on will not be an issue. If you are in the field
and need to save all the power you can, stick to manually triggering the
cameras.
Multiple Rigs
Problem:
You have multiple rigs in the scene and want to
control them individually.
Solution:
Pair one WiFi or Smart remote per rig.
Pair the cameras to a remote, one remote per rig. Remember to label or
color code. Now you can control the different heads individually. Once the
cameras in a rig are paired to a remote, they will stay paired to that specific
remote. Label the remote that corresponds to each rig so they do not get
mixed up.
Realtime Preview
Problem:
You can't see what you are shooting.
Happy accidents are sometimes welcome; in analog film they can result in
beautiful emulsions. In VR, shooting blind is not ideal and can
result in horrendous unfixable seams, dropped cameras, changed camera
settings, etc.
Solution:
Create a realtime stitcher with TouchDesigner for
live preview.
To create your own realtime preview, use TouchDesigner's 360 Stitcher
Component. A powerful graphics card and a video capture card are required.
The Nvidia GTX 980 is recommended.
To stitch a 360 video, each individual camera is warped and then the edges
of the overlapping images are blended to create a seamless panorama. For
example, if you have 4 fisheye cameras, a fisheye to spherical conversion
is applied to each camera. The 4 warped images are then edge blended
together to form an equirectangular image. This same process applies
for realtime stitching, where the camera inputs are warped and blended
together live. The equirectangular video is then mapped and textured onto
a virtual sphere. The headtracking information from the HMD drives the
rotation of a virtual camera inside the sphere. For an in depth read on
fisheye warping, see Paul Bourke's blog post Converting a fisheye image
into a panoramic or perspective projection.
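To make the warp concrete, here is a numeric sketch of the fisheye to
equirectangular conversion in Python with numpy. It assumes an ideal
equidistant fisheye looking down the +x axis with a given FOV; a real
stitcher calibrates each lens and runs this on the GPU, so treat the model
and the numbers as assumptions.

# fisheye_warp.py - warp one equidistant fisheye image into its share of
# an equirectangular panorama. src is an HxWxC numpy image array.
import numpy as np

def fisheye_to_equirect(src, out_w=2048, out_h=1024, fov=np.pi):
    h, w = src.shape[:2]
    # longitude/latitude of every output pixel
    lon = (np.arange(out_w) / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # unit ray for each output pixel; the lens looks along +x
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    theta = np.arccos(np.clip(x, -1, 1))   # angle off the lens axis
    phi = np.arctan2(z, y)                 # angle around the lens axis
    r = theta / (fov / 2)                  # equidistant model: radius ~ angle
    u = ((1 + r * np.cos(phi)) / 2 * (w - 1)).astype(int)
    v = ((1 + r * np.sin(phi)) / 2 * (h - 1)).astype(int)
    valid = theta <= fov / 2               # rays the lens actually sees
    out = np.zeros((out_h, out_w, src.shape[2]), src.dtype)
    out[valid] = src[np.clip(v, 0, h - 1)[valid], np.clip(u, 0, w - 1)[valid]]
    return out

Each camera in the rig gets its own version of this warp, rotated to its
orientation, and the overlapping edges are then blended together.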
The stitcher component will parse a PTGui Pro project file, create the
number of inputs needed, and apply the warping. Similar to APG, take a
snapshot of a frame from each of the cameras and create a calibration.
Then, instead of applying the calibration to render video frames, the warping
and positioning will be performed on the video inputs in realtime. Plug your
cameras into the capture card through HDMI. Save your .pts file and load
it into the stitcher component. Connect an Oculus Rift component in
TouchDesigner to output the live feed to a headset.
Congrats! Now you have an on set preview for the director.
plan
Safety Zones
And where do you place the viewer in all this?
As someone seated on the bank of the gushing torrent, taking in everything
that flows past, the flurries of motion and the moments of calm. But
I hope also as someone who plunges into the current, literally bathes in it,
carried away by the flight of their own imagination.
- Hou Hsiao Hsien, Interview on THE ASSASSIN
Problem:
You don't want seam lines, ghosts, or broken limbs.
If you don't want Siamese twins, meaning weird errors during the stitching
process, then keep subjects out of the hazardous seamline zone. This
will save frustration, hours of keyframing, Rotoscoping, and Masking
Markers, and render time during post.
Think like a stage director or magician and block actors for the space.
Solution:
Block subjects to stay within boundaries.
Have the subjects stay in fixed areas within a camera. If they must move
between cameras, have them cross a seam line at a farther distance from
the rig, so they are smaller. The seam will then be less noticeable. Tape,
mark down, and rehearse. Remember to remove the guidelines before
shooting or they will be in the shot! If you have a Realtime Preview or field
monitor, check whether subjects are sitting in or crossing a seamline.
If you are in the field and have no control over the environment, then adjust
and turn the camera rig for the fewest seams on the main subject and action
areas. In VR, you are not chasing a shot, but setting up the rig and letting
the moment flow to you. Follow your intuition and place the rig in a good
position. While you can't frame the shot, you can think spherically and
compose the space.
Clean plates.
If you are shooting a scene where subjects must cross seamlines, record
a take of just the environment. This will give you a clean plate of the
background for Rotoscoping and stitching with the Background vs
Foreground approach.
Sandbag
Problem:
You are using a monopod but can't hold it without
being in the shot.
Unless you are taking the ultimate 360 selfie.
Solution:
Use a sandbag.
If you are shooting on set and have a controlled environment and props,
you can use a monopod. Use parallax to your advantage and fix the camera
rig so the monopod falls between two cameras.
For example, if you have the monopod diagonally set over a couch, then
the sandbag will be hidden out of view. The monopod will fall into the
parallax zone and not show up in the stitch.
Shoot the Moon

Problem:
You are using a tripod but don't want it in the shot.
Sure you can patch in a logo to cover the tripod legs, but ads are lame.
Solution:
Shoot the nadir or zenith for replacement.
Bring an extra camera to shoot the floor for replacement later. If possible,
you will want this camera to be the same model, with the same settings, as
the cameras in your rig. Otherwise the camera will have to be warped
differently than the rest of the cameras in the rig. If you are using a
cylindrical rig, you can also capture the sky. The information can then be
composited in post production.
Position the camera as close as possible to the nodal point of the 360 rig.
Point it at the zenith or nadir hole. If you have the camera on a pole, you
can hold the camera out and have a partner move the camera rig while
you shoot a still or a 10 second video. Then place the rig back in the same
position.
Hide and Seek!

Solution:
Scope out a hiding spot.
Before shooting, plan your exit route and find a safe space completely
hidden from the view of the camera. Make sure to hide your shadow self
as well. With lighting, you can hide some of your lights in blind spots. Use
a Realtime Preview or do a quickstitch test to check that the equipment is
not in the shot.
shoot
Reference Signals
Problem:
You need to synchronize the cameras to each other.
There is no genlock sync on the cameras yet, so the videos will have to be
synced manually in post. Give yourself or the stitcher as many ways as
possible to find a sync point.
Solution:
Audio Slate
The old fashioned slate: 3 loud claps of your hands. Any noise with a
fast, sharp attack will be easier to sync in post.
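In post, the clap offset between two cameras can also be estimated
automatically by cross-correlating their audio tracks. A minimal sketch in
Python with numpy/scipy, assuming you have first extracted same-rate WAV
files from the clips (for example with FFmpeg: ffmpeg -i cam1.mp4 cam1.wav);
the filenames and the 30 second search window are assumptions.

# clap_sync.py - estimate the sync offset between two cameras from the
# audio slate by cross-correlating their WAV tracks.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

rate1, a = wavfile.read("cam1.wav")
rate2, b = wavfile.read("cam2.wav")
assert rate1 == rate2, "extract both WAVs at the same sample rate"

a = a[: rate1 * 30].astype(float)   # search only the first 30 seconds
b = b[: rate1 * 30].astype(float)
if a.ndim > 1: a = a.mean(axis=1)   # fold stereo down to mono
if b.ndim > 1: b = b.mean(axis=1)

corr = correlate(a, b, mode="full", method="fft")
lag = int(corr.argmax()) - (len(b) - 1)   # shift that best aligns the tracks
print(f"offset: {lag} samples = {lag / rate1:.3f} s "
      f"= {lag / rate1 * 29.97:.1f} frames at 29.97 fps")

A positive lag means the clap occurs later in cam1, i.e. cam1 started
rolling earlier than cam2.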
Motion Flash
If you have the time and materials, do a motion ash sync in addition
to audio. The speed of light is faster than sound so a ash sync will be
900,000 times more accurate than audio sync.
Some people like to twist the camera rig for motion detection but this
shakes the individual cameras in the custom rigs and is not accurate down
to the exact frame. Use an umbrella over the rig and give it a ash with a
speedlight. In post, you can nd the exact moment where the white from
the ash emerges, down to the exact frames.
Even at a high FPS, you can see that the cameras are still fractions of a
frame out of sync with each other. This will remain a problem until there is
true genlock/frame sync. Notice the GoPro rolling shutter effect.
DIY Genlock
MewPro is currently working on a genlock dongle that will allow true
genlock syncing for multiple GoPros. The dongle allows frame sync (VSYNC)
as well as scan line sync (HSYNC), but is currently available only for the
HERO3+ Black. The MewPro Genlock Dongle uses an Arduino Pro Mini 328
(3.3V, 8MHz).
Learn more about the MewPro Genlock Dongle by Orangkucing Lab.
Hit and Run!

Solution:
Leave the remote at the scene.
Instead of waiting for the WiFi to reconnect with all the cameras, wasting
time and battery power, leave the remote with the camera after starting the
shot. If shooting with a tripod, you can hide the remote under the camera
in the blind spot. After the coast is clear and the take is complete, you can
come back to the camera and hit the remote, which will still be connected
to all the cameras.
Flashing Lights
Problem:
How do you know if all the cameras are rolling?
Look and listen for the signs and signals.
Solution:
Look for the flashing red firefly lights.
You can tell if all the cameras were triggered after you start each take by
checking the red LED status indicator lights. They will continue to flash
while the cameras are recording, so you can tell if a camera failed or cut out.
Double and triple check the settings on all the cameras before shooting and
in between takes. The cameras sometimes change settings when knocked.
There is a lot happening on set or on the road and things get out of sync.
If an LED is broken, you can check the LCD monitor on the cameras. The
cameras will show the runtime of the take while they are rolling.
Stabilization
Problem:
You are shooting a moving shot and need to have
the smoothest stabilization.
The easiest 360 video shots are static, with the rig placed in a
fixed position. But smooth camera movement can add a great
sense of immersion to a shot. After all, as humans we are always moving,
and it's what we expect when watching media; otherwise it can feel a
bit dull. Dolly and drone shots add exciting movement to the experience.
However, extreme care and caution in stabilizing the shot is needed or the
viewer may get instant motion sickness. Any movement of the camera is
magnified in a VR headset and can cause nausea if not shot properly.
Solution:
The biggest thing to keep in mind is the inherent limitations of your specific
rig. This isn't a typical camera... so using it requires a different frame of
mind. Indeed, not only does the technology have many hurdles, but so does
the process of shooting a moving shot!
While recording the shot, if any actors or objects get too close to the rig, on
average 6 feet, then the parallax seams are going to be very obvious upon
stitching. These seams completely ruin immersion because they suddenly
expose the magic. Create an imaginary boundary line in your mind and
your shots will be beautiful and your stitching easier.
Each rig is different, and you'll have to experiment to understand its
particular limits of where the parallax is too obvious. For example, rigs that
use fisheye lenses have a huge amount of footage overlap, so subjects can
get closer to the rig.
Prior to the 360 shoot, think through the camera movements in depth.
Make sure that everyone involved in the shoot understands where the
imaginary boundary line is and why they shouldn't get too close to the rig.
On an incline, there is also a risk that the 360 rig will slide down. Be sure to
first test it out with some object that matches the weight of your 360 rig...
The last thing you want is for your 360 rig to come crashing down!
Manual / Motorized Wheelchair
A manual wheelchair can provide a surprisingly smooth shot. The tricky
thing here is to make sure that you are pushing the wheelchair evenly
forward. It's easy to accidentally get some shimmy motion, since it's
difficult to make it move perfectly forward. But this is obviously not a
problem with a motorized wheelchair.
Car, Golf Cart, Airplane
Use powerful suction cups to mount the 360 rig almost anywhere. Cinetics
makes a wonderful mount called the CineSquid to do just this.
Lighting
Problem:
You want the most optimal lighting conditions for
the shoot.
Lighting is tricky with the GoPros. In low lighting conditions, the image
has a lot of noise. Too much artificial lighting causes a variety of problems,
such as a blown out image, pollution, and color variation. Lens flares are
also more common with wide-angle and fisheye lenses. If shooting in stereo,
the flares and differences between eyes cause a jarring image. Also, where
do you hide the lights??
Solution:
Stay natural.
When shooting outdoors, if you have the time and patience, wait for the
right moment. Try to shoot around dusk, when the sun has just set but still
emits a light hue. However, there is only a small window to catch the perfect
timing. If shooting at a different time of day, the sun will be pointing directly
into one of the cameras, causing overexposure. You can Shoot the Moon
and try Patching Nadir if it is the camera pointing towards the sky.
For interior shots, do not use too many different artificial lights. They will
cause varying colors and shadows. Unless you are shooting a J.J. Abrams
style VR piece, be careful with pointing light sources directly at the lens and
creating lens flares.
Be careful not to use too much tungsten lighting, or the infrared pollution
will cause a purplish hue that needs to be color corrected.
Everything will show in a 360 shot and unfortunately you cannot play hide
and seek with lights. Try dressing your set and hiding light sources in blind
spots.
If you go with this lighting option, you should take a moment on set while
filming to review the imagery coming off the downwards facing cameras
to ensure they aren't catching any lens flares from the LEDs. If you are
getting flares, then you might need to mount a small flag to cut down on
the unwanted stray light.
One thing to keep in mind is that HMI lights use a metal halide gas, have
a very high power demand (12, 18, or 24 kilowatts, typically requiring a
generator), and get hot quickly. Make sure to get some help from a
professional lighting crew person when moving and setting up the lighting
gear, so you don't have issues or lose valuable time during a shoot.
Also keep the HMI lights set back a distance from any wooden siding or
flammable objects on the location, as the lights and nearby objects can
get warm if the lights are left on for a few hours. Also, when renting gear, it
is good to know that a modern HMI light fixture is much better than
an older model, as new HMI light controllers can hot restrike
the arc on the HMI gas, so you don't have as much down time between
turning the lights on and off.
Popular HMI lights for on location lighting are the ARRI ARRISUN and
Mole Richardson's Daylite Fresnels.
Misfires
Problem:
Oops! You hit record on one of the cameras by
accident.
It's easy to trigger one or two of the cameras by accident, multiple times
throughout the shoot. This offsets the camera take numbers from each
other and makes for a headache during Ingestion. For example, after take
01 you triggered camera 1 by accident. After the next take, camera 1 will
be at 03 files but the rest of the cameras will say 02. If you triggered
different cameras more than once, things start getting even more confusing!
Solution:
Play catchup.
If you trigger an individual camera, check the count number on all the
other cameras. Then trigger those cameras for 1-2 seconds to increase
their take count numbers. For example, if you misfired camera 01, then
trigger cameras 02-06 to match the same number of takes as camera 01
all around. If you misfired a camera twice, apply the same method.
During the ingest, the takes will then match across all the cameras and be
easier to separate into take folders.
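A tiny script can confirm the counts match before you ingest. A sketch in
Python, assuming one folder of .MP4 files per camera (the folder layout is
hypothetical):

# take_check.py - warn when one camera has a different number of clips
# than the rest, i.e. a leftover misfire.
from pathlib import Path

counts = {cam.name: len(list(cam.glob("*.MP4")))
          for cam in sorted(Path("Source/Video").glob("Camera*"))}
print(counts)
if len(set(counts.values())) > 1:
    print("Mismatch! Check for misfires before sorting into take folders.")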
Ring Around the Rosey

Solution:
Manually trigger the cameras.
Turn the cameras on and hit record one by one. Duck duck goose! Double
check that all the cameras are rolling. If you left the sound indicators on,
you will hear a beep for each camera you turn on; also check that all
the LEDs are blinking red.
Beep Code
Problem:
The GoPros' constant beeping is driving you insane.
kick, snare, clap, hi hat... beep.
Solution:
Welcome the beeps. Or turn them off.
The constant beeping can be quite maddening when you are adjusting
the default settings on all the cameras. Instead, reprogram your brain to
receive it as positive feedback that the cameras are working. Use the beeps
as sound indicators that a take has begun or a camera has stopped rolling.
Then the beeps only drive everyone else crazy, and you will have perfect
memory pitch of 659 Hz, which is an E5.
Frozen
Problem:
The camera keeps freezing up!
The camera froze halfway through recording and the LCD monitor is stuck.
Pressing buttons does nothing. Is the camera dead? Is all the footage
lost?!
Solution:
Remove the battery or change the microSD card.
Don't worry, your previous footage will still be there. The take the camera
got stuck on may or may not have recorded up to the moment it failed.
However, all your previous takes should be uncorrupted. GoPros will get
stuck due to software issues or the microSD card.
First try turning the camera off by holding down the MODE button for 10
seconds. If this doesn't work, take the battery out and put it back
in. If that doesn't solve it, make sure the camera is OFF and take the
microSD card out to determine whether it is a software or card issue. If the
microSD card doesn't have the proper write speed, it might not be able
to keep up with the recording.
On Fire
Problem:
The cameras are overheating!
When recording in high performance mode, the cameras start heating up,
using up more power and possibly shutting down.
Solution:
Keep it cool. Chill between long takes.
The HERO4 Black features unique high-performance video modes:
4K 30/24, 4K 24 SuperView
2.7K 48/50, 2.7K 30 SuperView, 2.7K 4:3
1440p 80
1080p 120/90
960p 120
720p 120
The cameras naturally consume more power to run these modes and
will increase in temperature. When shooting 360 video, you will most
likely be using one of these modes, so give the cameras a break and have
backup cameras if you can. The 360 rigs pack multiple cameras into a
tight formation, and the cameras generate even more heat next to each
other.
When the camera overheats, you will get an indicator that it is shutting
down.
Give the camera some time to cool down and let it rest. It is not
recommended to shoot in a high performance mode for an extended period
of time. For long takes you will risk both overheating and power
drainage. Again, have the cameras plugged into a Lifetime Supply external
source if attempting super long takes.
PROTIP: Add a tiny heatsink in the area between the lens and the power
button. Heat can be a serious issue, especially if you are recording using
the 4K video mode.
Modified Fisheye
Problem:
You want to shoot with fisheye lenses.
Solution:
Carefully remove and replace the lens.
With modified fisheye lenses on the GoPros, you have greater coverage per
camera with more overlap. Fewer cameras are needed in the rig for a full
360 stitch, which means fewer seams and less parallax! Subjects can get
real close to a camera without breaking a seam. With more overlap, you can
also shift the seams when rotoscoping or masking. However, since there
is so much more coverage, a lot of the image is in the overlap, resulting in a
lower final output resolution for the panorama. Shoot on the 2.7K settings
to achieve 4K final output.
Step one. Prepare your tools. Removing a GoPro lens is a very meticulous
process and you surely don't want to have to tear it out. Here are the tools
you need:
GoPro HERO3 or 4, lens, lens collar, flat screwdriver, mighty wrench
Step three. Remove the lens outer ring using the screwdriver from
three different positions. You will be breaking the glue points; pull the
outer ring out gently.
Step four, the most critical step. Remove the lens. The lens has to be
unscrewed from the GoPro lens mount. Use the mighty wrench to hold
the lens firmly while rotating the GoPro body counter-clockwise. Keep
rotating it until you are able to unscrew the lens with your fingers. Use the
heat gun again if needed instead of forcing it.
PROTIP: Since the original lens is also glued from the inside of the lens
mount, use the heat gun for 5-10 seconds over and around the lens.
Step five. Clean up the glue from the lens mount using your screwdriver.
Step six. Add the lens collar to your new fisheye lens.
Step seven. Insert and screw the new fisheye lens into the body of your
GoPro.
Step eight. Put the battery in and connect your GoPro to a monitor using
an HDMI mini to HDMI cable.
Step ten. Lock the focus by tightening the lens collar with the flat
screwdriver until the lens is locked and can no longer be unscrewed by hand.
problems of stitching

import
Dymaxion Chronofile
So, planners, architects, and engineers take the initiative. Go to work, and
above all co-operate and don't hold back on one another or try to gain at the
expense of another. Any success in such lopsidedness will be increasingly
short-lived. These are the synergetic rules that evolution is employing
and trying to make clear to us. They are not man-made laws. They are
the infinitely accommodative laws of the intellectual integrity governing
universe.
- Buckminster Fuller, Operating Manual For Spaceship Earth
Problem:
You have devils of details to worry about.
Be organized! Number your cards as well as your cameras. Color code your
cameras if you have Multiple Rigs. This will prevent headaches and
confusion during Ingestion and post production. All the normal problems
of shooting are multiplied by X cameras, so proceed with extra care.
Solution:
Get a journal or notebook.
Keep track of production notes, takes, the month and date of each new
morning, and share your growth! We hear the phrase "new cinematic
language" every day. What does that look and sound like? A single person
alone can't form an entire language; then there would be no dialogue.
We all need to create grammar, words, sentences, and poems, and build the
structure together. New words arise from a common need. Let's write!
Ingestion
Problem:
You want to see how your footage came out, but you
need to transfer all the data before quickstitching.
Whether you want to check the lighting on set with a quickstitch test or you
are ready to head into post production, copying all the data files is required
before stitching. If you have a Realtime Preview, record the input with a
capture box for a rough Dailies Quickstitch. Otherwise, there is no way
to view the takes without ingesting the footage and quickstitching.
Solution:
Ingest Manually.
Each SD card corresponds to a certain camera angle. When you ingest
video files from one SD card, you are uploading all the takes into one folder
(e.g. Camera 1, Camera 2). You will need to move the videos from each
camera folder into a new take folder (e.g. Take 1). Here's a snapshot of how
it looks before and after.
PROTIP: Before selecting the files to move to take folders, batch rename
the files with the camera number as a prefix. For example, select the letter
G from GoPro and rename all files with Cam1_G. In the next camera folder,
you just have to change Cam1_G to Cam2_G.
To quickly find which video files should be placed into a new take folder,
open all your camera folders using the dropdown arrow. Start by
highlighting the first MP4 in each camera folder, then look at the file size of
each one. If it's the same or close in size for all highlighted files, the files are
all from the same take. Drag them all into the new take folder. If you are
unsure, you can always open the videos and view them.
Renaming source files later can be tricky, so organize before stitching. Is
your project stereoscopic or monoscopic? If you shot in stereo, you will
have two of each camera angle, corresponding to the left/right eyes. Make
sure to note whether the video is left eye or right eye in the filename.
The simple saying "for every minute spent organizing, an hour is earned"
truly applies to 360 video editing. Remember you are editing the number of
take files times the number of cameras. Add a few prefixes to help you and
your team down the line, such as T01 for take number, HD or SD (4K/2K),
C01 for camera number, and LE or RE for left eye and right eye in the case
of stereoscopic projects.
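That renaming is tedious by hand; a few lines of Python can apply the
convention to a whole take folder. A sketch, assuming the files already
carry a CamN_ prefix from the PROTIP above (the folder name, take, and
eye values are examples):

# prefix_rename.py - prefix every clip in a take folder with take number,
# camera number, and eye, e.g. T01_Cam1_LE_GOPR0001.MP4.
from pathlib import Path

take, eye = "T01", "LE"               # set per take folder / per rig
folder = Path("Source/Video/Take01")  # hypothetical take folder
for f in sorted(folder.glob("Cam*_G*.MP4")):
    cam = f.name.split("_")[0]        # "Cam1" from Cam1_GOPR0001.MP4
    f.rename(f.with_name(f"{take}_{cam}_{eye}_{f.name}"))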
One of the major pains is the splitting of longer takes from the GoPros.
When files reach the 4 GB limit and the recording continues, the take will
be split into multiple files that need to be concatenated. With AVP 2.3,
concatenating files is handled internally during ingestion. You will see
small dropdown arrows for all large sub-sequences that AVP detects.
At the top of the ingester, display your files by sequence. Then check
"merge successive chapters" and "create subdirectories" at the bottom.
Now you are ready to ingest. Enter the path location of your source folder
instead of the desktop, for example LACIE/ProjectName/Source/Video/.
Then "Transfer Selection" and let AVP concatenate and ingest all of your
sequences, also called take folders.
Your files will be merged and renamed with minimal manual organization
needed. You may want to batch rename the files in each take folder and
add additional prefixes, such as LE, RE for stereo footage or
ShotN_SceneN_TakeN.
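If you are not ingesting through AVP, the chapter merging can also be
scripted. A sketch that joins GoPro chaptered files (GOPR1234.MP4 followed
by GP011234.MP4, GP021234.MP4, ...) with FFmpeg's concat demuxer,
without re-encoding; the folder path is hypothetical and the naming pattern
is an assumption to verify against your own cards:

# concat_chapters.py - merge chaptered GoPro files per take, no re-encode.
import subprocess
from pathlib import Path

folder = Path("Source/Video/Camera1")    # hypothetical camera folder
for first in sorted(folder.glob("GOPR*.MP4")):
    num = first.stem[4:]                 # "1234" from GOPR1234
    parts = [first] + sorted(folder.glob(f"GP??{num}.MP4"))
    if len(parts) == 1:
        continue                         # not a chaptered take
    listfile = folder / f"{num}_list.txt"
    listfile.write_text("".join(f"file '{p.name}'\n" for p in parts))
    subprocess.run(["ffmpeg", "-f", "concat", "-safe", "0",
                    "-i", listfile.name, "-c", "copy",
                    f"GOPR{num}_joined.MP4"], cwd=folder, check=True)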
PROTIP: When ingesting using AVP, make sure you have the exact number
of takes on each of your SD cards and that all Misfires have been deleted.
You may encounter problems with this way of ingesting if you didn't reset
the World Clock on your GoPros. This ingestion method is still in beta, so
double-check everything before transferring the files.
stitch
Dailies Quickstitch
Problem:
You need to quickly stitch some source footage
with burnt-in timecode for a review session, but you
don't know where to start.
You've just finished the Ingestion process and organized your source
footage onto a hard disk after shooting multiple takes for many scenes. It's
now time to sort and label your files into bins. As opposed to the traditional
post-production workflow, reviewing your dailies can't happen until your
footage is stitched together. Stitching two or more videos together will
first require you to organize your files properly.
Solution:
AVP + APG.
Most video camera manufacturers are developing built-in functionality to
ease the stitching and playback of 360 dailies. If you don't have a Realtime
Preview solution, you will have to stitch the videos yourself before
previewing dailies. Thanks to Autopano Video Pro (AVP) from Kolor, it's just
a few clicks away.
Autopano Video Pro, or AVP, is the go-to video stitching software and
industry standard. It takes at least two videos for the stitching process to
occur. Since stitching videos is an intensive task for most of today's GPUs
and CPUs, AVP has a sister software: Autopano Giga, or APG, the advanced
stitching tool for combining multiple images into a panorama.
To stitch multiple videos into one panoramic video, AVP will extract a
frame from each of your videos as a JPG image. These images will be
used to build the calibration for the stitch.
When the stitch quality is not perfect throughout the video, use APG to edit
the stitch manually with the use of Control Points or Masking Markers. In
APG, you will have the option to auto-update your stitch in AVP by saving
your stitch template. The template holds metadata that allows AVP to
stitch your videos based on your adjustments in APG.
PROTIP: Before you start stitching, it is best to check the preferences of
AVP. Under Blend, set Blending Level to 0 and Weighting to ISO Cutting, and
under Render Settings, set FPS as original video. Later on, you may also
want to add different presets for your renders, helping you speed up your
own workflow.
Before jumping into the stitch tab (the fourth icon in the AVP header bar),
select a range of frames by trimming your timeline at the beginning and
end using the blue range selector. Use "I" for the IN and "O" for the OUT
frame with AVP 2.3. Then click on the exact frame you want for the
calibration. Don't leave it on the beginning frames. You don't want to
confuse AVP by trying to stitch the DP's fingers or face. Save that for later
during the fine stitch.
Select a stitching preset using the dropdown. The default preset will auto
stitch as GoPro. If you are using a different camera lens, check "Lens model"
and input the focal length and lens type. For example, enter 8mm for
your focal length and fisheye for the type of lens. Press "OK", then click
"Stitch" and let AVP do the rest!
Select the Blend icon to better optimize the blending of your videos. For
static or landscape shots, try SMART cutting and you may be impressed
by how much the quality of the stitch improves. For most shots, when the
camera is moving or you have moving subjects, ISO cutting is
recommended.
Rendering is the last step in the workflow. Every software you use to edit
the video or audio of a file will let you export the changes by creating a
new video or audio file with the render settings you selected.
Before you start rendering, double check that all your default preferences
are correct. Consider the right FPS for the playback solution of your
choosing. Even if you shot at 100 FPS or 60 FPS, you will want to output at
an FPS that the headsets or video player can handle.
For example, if you want to upload your 360 video to YouTube or Facebook,
the currently allowed FPS is 24, 25 or 30. For quick stitches, set the FPS to
be the same "as original video" under the Render settings. Setting the
default preferences will make it easier to batch render.
When you are ready to hit the "render" icon, AVP will bring up a pop up of
presets to choose from and show the maximum output size. The
maximum output size is the resolution achieved from your 360 camera
rig. Depending on the rig you chose, the final resolution after stitching
can range from 4K to 12K. Presets are very valuable during stitching and
you will want to get familiar with all the choices. When you want to render
small files quickly to test and find seams to fix, you can output at a lower
resolution such as 2K. You can always check at the bottom of the pop up
window what resolution and frame rate the video will render at. For the
Gear VR, render your videos at 3840x1920 or 4096x2048 when shooting
4K (1920x960 is SD).
If you render at a low bit depth before color grading, pixels will get distorted
down the line. Distortion occurs within the range of 10-16 bits.
Starting with Autopano, you will want to work on the highest resolution
files to minimize distortion of colored pixels and keep the full quality when
rendering an 8-10 bit per channel video. Output TIFF frames at 16 bit with
no compression in AVP.
PROTIP: Removing the alpha channel when exporting TIFFs will reduce the
size of each TIFF. Recommended for large sequences.
Every time you render, you are creating a new file. Stay organized so you
know what version each render is. Add a prefix to every file. Use QS for
Quickstitch, a version number _v001 for your tests, and FS for Fine Stitch.
When rendering frames, select an output folder with the suffix _tiff in the
name.
PROTIP: On Mac, if your Finder is open with your video visible, drag
the folder icon into the Terminal window AFTER typing "cd" (i.e. change
directory). On PC, click the folder icon to reveal the path, and paste it into
your Command Prompt after "cd".
Type the exact FFmpeg script for the action you want to perform on the
video: embedding a timecode in the center of the video, at the same frame
rate as the video.
Run FFmpeg by simply typing "ffmpeg" in the terminal. FFmpeg takes a
video in and creates a new video out. Let's tell FFmpeg where and which
video you want as input. Just type "-i" and the path/name of your file:
ffmpeg -i video.mp4
Type the name for the output file. This FFmpeg script doesn't really perform
any action besides renaming the output file. If you change the
extension of the output filename to .mov, FFmpeg will convert
your video from MP4 to MOV.
ffmpeg -i video.mp4 video_tc.mp4
To add any kind of text or timecode to your video, use the filter "drawtext",
called via the -vf option before the output, such as:
ffmpeg -i video.mp4 -vf "drawtext=" video_tc.mp4
Then add the format for the timecode, including the frame rate (matching
the same FPS as the video), font size, color, and position on the video:
timecode='00\:00\:00\;00': r=29.97: fontsize=32: fontcolor=white: x=(w)/2: y=(h)/2
Note the colons are required between each argument. Put all of this
together into one command line:
ffmpeg -i video.mp4 -vf "drawtext=fontfile=/Library/Fonts/Arial.ttf: timecode='00\:00\:00\;00': r=29.97: fontsize=32: fontcolor=white: x=(w)/2: y=(h)/2" video_tc.mp4
Press RETURN after pasting this line into your Terminal, and FFmpeg will
render the video again with the timecode on it. Good job!
If you get an FFmpeg error message of "Drop frame is only allowed with
30000/1001 or 60000/1001 FPS", that means your video clip is using a
non-drop-frame timebase such as 24/25/30/60 FPS. To fix this issue,
change the FFmpeg timecode string value to timecode='00\:00\:00\:00'
and adjust the r=29.97 timecode frame rate setting to match your current
video clip's frame rate.
Note that the fontfile path on Windows needs to have each of the directory
slashes escaped with a double slash, and the colon in the drive letter needs
to be escaped with a slash as well.
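Once the single-file command works, it is easy to loop it over a whole folder
of quickstitches. A sketch in Python that shells out to FFmpeg; the Renders
folder and QS_ naming are hypothetical, and the font path is the Mac
example from above:

# batch_tc.py - burn the timecode into every quickstitch in a folder by
# running the drawtext command above on each file.
import subprocess
from pathlib import Path

DRAWTEXT = ("drawtext=fontfile=/Library/Fonts/Arial.ttf:"
            "timecode='00\\:00\\:00\\;00':r=29.97:"
            "fontsize=32:fontcolor=white:x=(w)/2:y=(h)/2")

for clip in sorted(Path("Renders").glob("QS_*.mp4")):
    out = clip.with_name(clip.stem + "_tc.mp4")
    subprocess.run(["ffmpeg", "-i", str(clip), "-vf", DRAWTEXT, str(out)],
                   check=True)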
Color Matching
Light or luminosity is created by the way elements are juxtaposed. They
become reflective, and a radiance comes from putting different things
together.
Problem:
One camera is too bright or too dark, affecting the overall blending.
Pure white reflects 100% of the light, while pure black reflects 0% of the
light. Any camera's metering system wants to meter everything as middle
gray, usually around 18% gray (the reflectance of a standard gray card).
Exposure compensation is therefore a challenge during production, and an
even bigger challenge when shooting in 360 degrees.
When correcting the exposure of a camera in post production, figure out
what happened in production. Was the shot overexposed? Was white
balance set to auto? Exposure compensation adjusts brightness within the
existing ISO Limit. If brightness has already reached the ISO Limit in a low
light environment, increasing your exposure compensation will not have
any effect.
Solution:
Read the RGB histogram.
Learn how to read and understand RGB histograms. R, G, B: red, green,
blue, the 3 primary colors that make up your image. Lows, mids and highs
are color ranges that correspond to your low lights (also called shadows),
midtones, and highlights. The histogram is a representation of the
distribution of the colors (or pixels) in an image.
There are two histograms. The main color histogram shows the red, green
and blue channels (the actual real data), while the one-channel combined
histogram is only a simulated, computed value called luminosity. Use
the color histogram, or select an individual channel to adjust, instead of the
combined histogram.
You can read an overexposed shot by comparing the red, green and blue
channels and finding one or more spikes in them. A red spike in the
highlights range would mean your shot was overexposed, maybe by two
thirds of a stop, and correcting the levels of red would help balance all the
colors in the image.
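You can check this numerically as well as visually. A sketch in Python with
numpy and Pillow, assuming you have extracted one reference frame per
camera (the filenames are hypothetical):

# histo_check.py - compare per-channel histograms of one frame per camera
# to spot an over- or underexposed camera before stitching.
import numpy as np
from PIL import Image

for name in ["cam1.jpg", "cam2.jpg", "cam3.jpg"]:
    img = np.asarray(Image.open(name).convert("RGB"))
    for ch, label in enumerate("RGB"):
        hist, _ = np.histogram(img[..., ch], bins=256, range=(0, 255))
        clipped = hist[250:].sum() / hist.sum()  # share of near-white pixels
        print(f"{name} {label}: mean {img[..., ch].mean():.0f}, "
              f"clipped {clipped:.1%}")

A camera whose channel means sit well above the others, or whose
clipped share spikes, is the one to correct first.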
In AE, bring all the source footage into one composition and align the clips
horizontally with 5-10% overlap over each other. You can color match or
exposure match all the overlapping areas or edges with this setup. When
stitching, the overlapping areas will then blend much better. The pixel
colors of the edges will be easier for Autopano's algorithm to interpret.
Additional control points can be found by adjusting the gamma on a shot
that is over- or underexposed.
Apply the Levels plugin to each of your video layers and review the
histogram for every layer. Take note of the spikes, which will help you
understand how to accurately gamma correct.
Adjust the gamma's mid level by 0.2 points up (to the right) or down (to the
left). Try not to adjust the individual color channels, as this distorts colors
too early in the post production workflow.
Synchronization
Problem:
The cameras are out of sync, causing a bad stitch.
When you stitch a moving shot, or a static shot with moving objects or
people, you will encounter magic you didn't expect, such as people
disappearing randomly or getting shrunk as they cross cameras, or you
may think you're seeing double. A few causes can explain these surprises.
Usually it is a sync related issue. If one or more cameras start shooting
with a slight delay, you need to resync in post.
Solutions:
Use Autopano's built in synchronization.
Synchronizing your videos is the first step before the footage is ready to stitch. After dragging your videos into AVP, use the built in synchronization. This feature only works if there were Reference Signals, like an audio or motion signal recorded at the start of the take during the shoot. In some situations, there is no audio or visual signal for sync: you shot the camera angles at different times, the shooter forgot to audio slate, the audio on the cameras got dropped, or there was no speedlight for a motion flash that day. In these extreme cases, manually input the offsets of the videos to be stitched. Find a visual sync frame and use one camera as an anchor. Look for a frame with fast moving motion, such as legs running or hands clapping, and match the rest of the cameras to it.
After dragging your videos into AVP, find the Synchronization tab and open it. Select the frame in your timeline closest to a clap or any high peak in the audio signal.
AVP lets you select the range in seconds for the auto detection to happen, 20 seconds being a good average. Select the Use Audio to synchronize option and click Apply.
The second option, Use Motion to synchronize, will only work if you used motion or a speedlight flash during production. Select the nearest frame and a range for AVP to auto-detect the flash or motion in each of your videos.
Your videos will be processed and placed into a bin. Optionally, you can rename the created sequence based on your log notes if you need to sync multiple takes. Right click and Open the Multicam sequence in the timeline to see how the video tracks have been synced.
The quickest way to record the sync offsets is to lock the audio files only, not the video files. Finally, drag all the video files to the last clip on the timeline and Premiere will display the offset in frames. See the picture below for reference.
PROTIP: If you are editing your First Assembly with Premiere, it may be a good idea to update the file/folder names between your quickstitches and your source cameras. Add a shortcode such as SYNC, QS for Quickstitch, FS for Fine Stitch, CC for Color Corrected. Rename the Processed Clips folder to the shot name and include all needed and related assets in the bin folder.
The red line below your cursor will help you see how to move the video stream to the left or right (forward or backward in the timeline). After aligning the layers based on the audio peak in the waveform, zoom in to the timeline for accuracy.
Now you have two options: trim the videos and render only the footage in sync, or record the sync offset of each video track. Let's trim in this case and render the new video stream, now synced and ready for stitching.
For all other video tracks, subtract each video track's start frame from the largest start frame. For example:
C1: Start Frame = 305; Offset = 305 - 305 = 0
C2: Start Frame = 0; Offset = 305 - 0 = 305
C3: Start Frame = 28; Offset = 305 - 28 = 277
C4: Start Frame = 218; Offset = 305 - 218 = 87
C5: Start Frame = 158; Offset = 305 - 158 = 147
C6: Start Frame = 69; Offset = 305 - 69 = 236
Log the offset of each video track and input them in the Synchronization section of AVP.
Syncing your videos is a basic, required step before stitching. Make sure to double check the sync offsets or you may end up spending hours trying to fix a stitch when it was really a sync issue. AVP makes it easy to sync in the software, but it is best to manually verify that the sync offsets are spot on with an alternative method.
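One such alternative is to bake the logged offsets in before stitching by trimming each camera with ffmpeg, converting the offset to seconds (offset / fps). A sketch using the example offsets above, assuming 29.97 fps and placeholder file names:

# C3 is offset by 277 frames: 277 / 29.97 = 9.243 seconds
ffmpeg -ss 9.243 -i C3.mp4 C3_synced.mp4

ffmpeg re-encodes by default here, which keeps the trim frame accurate; stream copying (-c copy) would only cut on keyframes.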
Background vs Foreground
Problem:
While trying to fix the stitch, you broke the background by adding control points on subjects close to the camera.
Most of the time, you will not be able to fix all the seams with only one stitch template.
Autopano automatically extracts a frame from each camera, allowing you to edit the stitch calibration of that specific frame. When you update the calibration for one frame, it will update and apply the changes to the entire video. After previewing the video, a seam is still visible where a person crosses through it. Should you fix the person crossing or the seam in the background?
Solutions:
Depending on the rig you chose, parallax can be increased or reduced. When both the foreground and background contain essential objects or subjects, it is necessary to split your work into two stitching phases. Stitch the background first. Render. Stitch the foreground second. Render. Comp them together.
Background approach.
Subjects that are too close can't be fixed when stitching the background. Focus on the distant background. Select the frame with the most seams. If it's a static shot, any frame will do. If it's a moving shot, preview the quickstitch to help select a frame.
Leave the close objects or subjects distorted and focus on the distant seams. In the advanced settings of the control points editor, move the slider for the Distortion and Offset scopes to Image and select Optimize 2nd Order from the Distortion dropdown.
Save your project and add a version number, for example T001_BG_1.kava. Keep the different stitch templates in your take folder to help you stay organized and save time on future adjustments. Render your work.
Foreground approach.
There are many creative ways of stitching the foreground, from 3rd order calibration to ignoring 2 out of 4 cameras. When many actions are happening at different angles, you may even render all cameras separately, without any blending, to comp over your background later. Foreground stitching is mostly used for comping purposes; fixing people or objects that are close to the camera rarely renders a good stitch for the background.
After changing the settings, remove all control points on the background from each set of images. Auto-detect control points on the foreground objects or subjects and Quick Optimize. Auto-detect more points on the foreground. Quick Optimize again. When your RMS value is lower than 4, check Clean bad points from the steps tab and perform a full optimization.
The background will break, as the distortion was adjusted to stitch your foreground. Save your stitch template and add a version number. Preview with AVP and fix this template until satisfied. Render your foreground work. You may need to render in sections, with a stitch template fixed per section. Bring the background and foreground renders into AE to perform AE Comping. Done!
Optimization Settings
Problem:
Optimizing quickly, optimizing in advanced mode, and optimizing too much.
The optimizer engine of Autopano is really smart by default; it's what quickly stitches your panorama for an initial auto calibration. The problem solved by the optimizer can be seen as a curve fitting problem: given a curve model (e.g. y = a*x + b), find the parameters (a and b) that make the curve fit a series of data points as well as possible. In the context of panorama stitching, the model is the equation of projection of a 3D scene point to a 2D picture pixel, and the parameters are the calibration unknowns and the orientation of each image.
The optimization settings you decide to adjust will then affect the stitch quality and its seams, as well as the RMS value. Often we think optimizing will solve our problem when in fact it can create additional problems.
Solution:
The RMS Value.
RMS stands for Root Mean Square, which in statistics is the square root of the mean of the squares of a sample - oh yee! In our context, think of the RMS as a value representing the overall quality of the calculations between all the control points found in the overlapping area of two images.
The lower your RMS value is, the better your stitch should be!
For the optimizer to calculate the RMS while improving your stitch, it needs a curve model and the data: the matched control points coming from the detector. Some points are good but never perfectly accurate, while some are completely wrong.
The optimizer performs a series of steps: first fit all control points as well as possible, apply a threshold for cleaning bad points, re-estimate the model parameters, and then compute the final RMS. The final RMS value is the mean size of these error segments; it is not a measure of the visual quality of the stitched panorama.
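In symbols (our notation, not Autopano's): if control point i sits a distance of e_i pixels from where the current calibration reprojects its match, and there are N control points, then

\mathrm{RMS} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} e_i^{2}}

so a single wildly wrong point can inflate the value even when most of the panorama looks fine.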
Let's imagine you have a 360 rig with 4 GoPros and have dragged all the videos into AVP. Under the Control Points tab, check Advanced to see all the Optimization settings affecting your control points.
First, under Scopes, set both Distortion and Offset to the Image scope but keep Focal at the Pano scope, since all focal lengths are identical across cameras. For a background stitching approach, set Distortion to Optimize 2nd Order from the dropdown, and 3rd order for foreground stitching.
Control Points
Problem:
The control points editor has manual and
auto-detection. Which should you use and how will
the RMS be affected?
You may be overwhelmed when launching the control points editor, especially if you are stitching with more than 10 cameras. Control points, links, RMS: what does all of this mean? Understand that in order to stitch multiple videos or images together, there needs to be an overlapping area.
Refer to the Optimization Settings chapter to understand the RMS value. Videos are stitched, or linked, together through the use of points that can be added manually or auto-detected. The points are then cleaned up by Autopano's optimizer engine. Should you add points manually or auto-detect them?
Solutions:
Simple, fast auto-detection of points.
After importing your videos, AVP will stitch them based on a lens preset or a custom focal length and distortion for your lens. AVP will then do an auto calibration and position the cameras in a 360x180 LatLong format. AVP stitches the cameras together by auto detecting and generating control points, the matching pixels between 2 images.
After the initial calibration, you will then be able to edit the stitch template in APG. Note that APG will auto extract the frame your timeline cursor is on as a JPG for each of the cameras and then operate its stitching process on these images. The changes you make to the panorama of this still become the template that AVP applies to the rest of the frames of the videos. AVP handles the synchronization of the videos and applies the APG stitch calibration of the selected frame to the rest of the video. AVP then spits out the frames of each camera and renders the applied template to the selected in and out region.
The first window of the control points editor lets you adjust the optimization settings and displays visualizations of your camera images as a network of links. Each link has its own RMS value.
The number in the green boxes is the RMS value for each pair of linked cameras, visually represented with interconnecting yellow lines. RMS is a measure of error between a point and the current estimation, NOT the ground truth, which is unknown. Below this number is the number of matching control points between the 2 cameras.
First, edit the control points between the two cameras where there is a clearly visible seam. Select the green box linking the two cameras and a window will pop up to let you auto-detect or remove points.
In the CP Editor window, you will see the two cameras and the control points connecting them together. Use your mouse cursor to draw a rectangle selecting the overlap region on one of the frames. Then draw a rectangle selecting the corresponding region on the other frame. APG will automatically detect control points in the shared rectangular area.
Use the Quick Optimize icon at the top of the window. Repeat this step as necessary. When satisfied, check Clean Bad Points and Fully Optimize. The RMS will be updated. Repeat these steps for each relevant link between 2 cameras. Use the PREVIEW area to check the improvements and continue cleaning points in the CP editor until the stitch is improved.
Stitching using only auto detection of control points is less time consuming and saves time to explore other tools, such as the Masking Markers. However, you should still understand how to manually add and remove control points.
After repositioning the cameras, remove all links, then relink at least two cameras to each other by right clicking one and selecting the second camera from the dropdown. Open the second window by clicking on the green box between the two cameras and start adding control points manually or auto detecting more control points.
In the left area of the window, select another pair of images and draw a rectangle selecting the overlap regions to auto detect and add new control points. This will automatically link two new cameras together. Repeat these steps until all the cameras are linked. Don't forget to optimize the manual adjustments you just made.
Parallax
Problem:
Two eyes are better than one because they give you two different views of the world. By combining these two views, your brain can estimate distances to nearby objects. Try pointing your finger in front of an object. With your left eye open, align your finger with a reference object in the distant background. Now open your right eye and close the left one. Your finger is no longer aligned with your reference object. This is the infamous finger experiment used to explain parallax. How do you fix parallax issues in Autopano?
Solutions:
Parallax creates stitching errors but also stitching opportunities: fixing overlapping areas where an object needs to be kept or removed using Masking Markers, instead of the Patching Nadir method. Parallax also enables stitching tricks such as hiding a monopod in the parallax zone with the Sandbag technique.
Position the marker in the right place. The fewer markers, the better.
Head back to AVP, select a frame a few frames after the previous one, then click Edit again. In APG, repeat the previous actions but add a keep marker on the other camera. Remove the previously placed marker. Don't forget to always apply your changes with the green check icon, and save your stitch template for each edited keyframe.
Preview the masking marker changes in AVP. The masks should have solved the parallax issue of the subject crossing from one camera to the other.
This technique works in many cases but not all. You may experience some strange popping in the background, which is the consequence of forcing the blending with markers. The transition will be a straight cut in the timeline. The popping can be reduced by moving the keyframe to the right frame on your timeline. Finding the right timing is key for this technique.
PROTIP: You may be able to smooth the popping by extending your transition from state 1 to state 2 and by adding one of the curve transitions.
For cases where the subject crossed between cameras at too close a distance, the masking markers may not help and the popping will be too obvious. The parallax is even more obvious when the subject is close to the rig and crosses between two cameras. Chromatic aberration towards the edges of the lens is also greater, so try to keep your subjects in the Safety Zones, within a single camera's view.
Problem:
As you are fixing a seam, one or two other seams start to appear.
You only have one small seam to fix before rendering. You make some quick changes, almost done! You preview just a few seconds and one or two seams show up out of nowhere. Should you have even fixed the small seam in the first place, or should you have previewed the entire video before fixing any seams?
Solutions:
Gotta get them all at once!
To quickly fix all the seams, find most of them in one frame.
The best playback for testing is QuickTime or VLC. AVP is great for previewing but not optimal for real time playback at the actual FPS, which will cause you to miss some seams. Take notes of the frames where seams require some work while viewing your Dailies Quickstitch.
When you are ready to fine stitch, reopen your previous kava project or start a new one by dragging your videos into AVP. Select the frame with the largest number of seams and start by fixing all the seams in that frame. Update AVP by saving the template in APG.
Using the blue range selector in AVP will improve the average quality of the stitch for the selected range, based on your in and out points. The Optimizer engine and stitching algorithm will focus on that range, instead of the beginning where your DP's face is all over each camera, unstitchable!
Fixing all the seams at once makes it easier to prevent new seams from showing up, because you already have an overview of all the worst seams. Make a plan of attack that conquers all the large seams at once. Then the small seams can be fixed with Masking Markers or a simple optimization.
Surreal Bodies
Problem:
When a subject is close between two cameras, you see strange shapes suddenly appearing.
Unless you are trying to create a surreal dreamscape with unconscious bodies, most of the time you will want the stitch to be as close as possible to reality. While the unexpected is always a beautiful mystery, you are looking for a logical solution to this odd problem.
Solutions:
Foreground by subtracting layers.
With Modified Fisheye lenses, subjects can get closer to the cameras because the FOV of each camera is wide. A subject can get as close as 1 foot to the camera without breaking a seamline. There is also more overlap between cameras, allowing you to move the seams with masking markers. With the extra overlap, there is always more than enough information for you to fill in or fix pixels. A 4 camera rig with modified fisheye lenses almost creates a full 360 video with just two of the cameras, giving you two extra cameras of information. When subjects get too close, you can uncheck either the odd or the even cameras (layers 1 and 3, or 2 and 4).
There are many advantages to using a fisheye lens, such as the wider FOV. However, there will be higher distortion when a subject moves closer towards the camera. When subjects move between seams, the parallax is more pronounced, and there is more chromatic aberration towards the edge of a fisheye lens. This creates a more obvious popping effect when using the Masking Markers.
Check and uncheck some of your cameras in APG using the group layers at the bottom of the window. When shooting with 185 degree fisheye lenses, there should still be a full seamless stitch even if you hide two of the cameras. This is similar to the iZugar Z2X rig, a 2 camera rig with modified fisheye lenses. With 4 cameras, there is plenty of extra pixel information for patching or replacing any problem areas.
Render the best stitch of the two cameras. The panorama may be perfect, or close to perfect with just some missing pixels in the overlap. AVP fills the holes with black. To fill the holes, use the information from the other two cameras: select the camera layers of the hidden cameras and use the red remove markers to delete the extra information you already have.
Template Stitch
However vast the darkness, we must supply our own light.
- Stanley Kubrick, 1968 Playboy Interview
Problem:
Your footage is too dark and Autopano can't detect any control points.
Autopano has a difficult time generating an automatic stitch when all the pixels are the same. For example, if you shot underwater or in a room with all white walls, most of the cameras will see only blue or white. If the shot was underexposed, most of the pixels will be dark and muddy. Autopano's detection algorithm will then have a tough time connecting links and creating a calibration.
Solutions:
Apply a template.
When stitching videos that are filled with mostly the same color, Autopano will generate a distorted stitch. Some of the cameras may detect control points, while others may twist and warp the wrong way. The auto detection might overlay images on top of each other, treating the similar colors as control points. You know the exact rig you shot with, so apply a template from a different shot.
Before applying a template, select a smaller range on the AVP timeline for the auto calibration. Look for a section where there are more objects and colors for Autopano to detect control points from. If the stitch does not improve, then choose a previous template from another scene that was shot with the same rig and camera configuration.
After applying the template, you should see your videos stitched into a nice panorama. However, there will be no control points or links. Under the control points tab, select Geometry Analysis. APG may detect some control points now that you have at least applied the warping and underlying geometry of the camera rig. Remember to optimize any new points found.
When using templates for stitching, the positioning of your cameras will also affect the detection algorithm. Check the links for any cameras that were linked incorrectly. Unlink all the cameras, then use the move tool > move by camera to place each individual camera into the correct position.
To link one camera to another, use Geometry Analysis or right click on the number. In the CP Editor, detect and add new control points by drawing a rectangle over the overlapping regions, or add matching control points one at a time manually with the add points tool. Switch to another pair of cameras by selecting two cameras from the list and find new control points to link them. You are right on track again for creating a great stitch!
Masking Markers
Problem:
You used the masking tool but after previewing the changes, the objects or people are still there.
After becoming familiar with how Control Points work, explore the other tools, like the masking markers, to improve the stitch. Use the red or green markers to either remove or keep an area on a camera.
Solutions:
Understand the anti-ghost.
The masking tool allows you to select where the anti-ghost algorithm acts in an overlap region, deciding which of the two cameras has priority. The masking tool does not create content or pixels. Ghosts can only be eliminated in overlap regions.
The subject and its shadow will be kept in the first image, and the half cut shadow will be completely removed.
A common mistake is to use the markers like a brush, covering the entire subject and image with green or red markers. Anti-ghost is a smart, complex algorithm that detects paths in the image. Only a few markers are needed. If the desired effect is not accomplished, try moving the markers to a more relevant place.
The masking markers can fix visible seams that control points can't. In the masks section, click the small icon that looks like a Q in the bottom left corner of the stitched image. This helps visualize how the anti-ghost algorithm is moving the seams.
PROTIP: If your cursor doesn't let you highlight a camera, check or uncheck camera layers.
The seams will update in real time according to the smart placement of the masking markers. Click Preview and apply or remove markers until the preview looks seamless.
In the Preview section, you can also test the alternative blending options, Smart or ISO cutting. Save your pano file, go back to your preview in AVP and play back from the IN frame to see the improvements.
Stereoscopic 3D 360
Problem:
Stitching in stereo mode: a similar workflow, squared.
You decided to shoot your scene in stereo, on an 8, 12 or maybe 14 camera rig. It will definitely impress viewers by adding depth to your 360 videos, maybe even compete with some computer generated experiences! Stereo 360 video experiences, however, lose half of the potential resolution: you will need to render your left eye monoscopic video on top of your right eye monoscopic video, creating an Over/Under, generally 2300 x 2300, for playback in a VR headset. You know how to stitch a mono panorama, but how do you approach stereo stitching with two videos? Is it a similar workflow?
Solution:
Most of the critical work to create a stunning stereo experience should happen during pre-production and production. When you are filming in stereo, the distance between cameras, their alignment, and how subjects were directed in relation to the space are all things that can't be corrected in post without a tremendous amount of work, doubling and sometimes tripling your original budget. Remember, stereo is not double the work, but exponential. With every adjustment made to one eye, you have to go back to the other eye to check the disparity. While stereo can enhance the experience, done incorrectly it creates viewing discomfort, especially where the stereo does not converge properly.
Keep all your camera files in one take folder, not separated into a left eye folder and a right eye folder. The stitch template is saved and used for both eyes.
Issues in stereo stitching include the cameras not being in sync, the color not matching, and subjects being too close or crossing between cameras. These problems will all cause the seams to be very apparent; with subjects too close, the seams may even be unfixable. If your subjects cross at a greater distance from the camera, the seams are fixable but the depth isn't as impressive, which calls the need for stereo into question. Before you even stitch anything, reduce all the risks involved with stitching in stereo: perform Synchronization and Color Matching on all your cameras outside of Autopano.
With all your cameras in one take folder, and file names reflecting the Left eye (LE) or Right eye (RE) camera, synchronize manually with After Effects. You will be able to sync based on the audio waveforms as well as any flash or motion signal. Trim all the cameras so that only the footage available in all cameras remains, perfectly synced to the frame. While you synchronize the cameras, adjust color matching in the same step. Correct the mid gamma level slightly on just the cameras that need it, but never all of them, since one is used as the reference. Re-export all your cameras as lossless .mov files or mp4s.
You've reduced the risks of a bad stitch with these two steps. Import your videos into Autopano: same length, same FPS, same format, synced, and color matched. First, check the stereo tab. Turn on the stereo mode and assign your cameras to their respective eyes; this is why renaming your files with LE or RE prefixes can be handy. Go to the stitch tab and input your lens and focal length. Now you are ready to stitch.
In Autopano, start by creating a new group layer, located in the lower area of APG, and drag all your right eye cameras into the new layer. That way you can easily switch between the left and right eye while stitching.
Next step: Circular Crop Factor. Handle this tool with caution. When the calibration isn't right, it may be due to your lenses, in particular fisheye lenses, and not a control points issue. Edit the circular crop of each of your cameras by first choosing a radius amount and applying it to all images.
Fix the alignment of each camera by pointing the center point at the exact same pixel for each pair of cameras, plus the horizontal offset needed to create depth. A farther object will need the center point placed almost exactly on the same pixel, while a closer object will have a greater offset of the center point.
Finally, you are ready to fine stitch. Here's the catch with stereo stitching: it's not double the amount of work but more like squared! Why? Every time you fix a seam, you will need to ensure the seam is fixed the same way in the opposite eye for stereo disparity. If not, you may need to go back and fix it differently.
An Autopano stitch template holds a great amount of information, except when the computer crashes. Remember to save constantly and keep all the different versions of your stitch template so you can easily revert to a preferred stitch version.
If you need to combine your videos side by side, left eye on the left and right eye on the right, here's the one-liner to enter in your Terminal after changing the file names to match yours:

ffmpeg -i left.mp4 -vf "[in] pad=2*iw:ih [left]; movie=right.mp4 [right]; [left][right] overlay=main_w/2:0 [out]" output.mp4
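For the Over/Under layout used for headset playback, a simpler variant stacks the eyes vertically with the vstack filter (a sketch; file names are placeholders and both videos must share the same dimensions):

ffmpeg -i left.mp4 -i right.mp4 -filter_complex "vstack=inputs=2" overunder.mp4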
Photographers play with the ultra wide angle effect, experimenting with artistic distortion. For 360 video, fisheye lenses are valuable to the engineering of the rig, improving the results of the footage. Each individual camera has a wider field of view, increasing the overlap area between cameras. Fewer cameras are then needed to complete a full 360 degree stitch, so the cameras can be spaced closer to each other, reducing parallax. Keep in mind the final output resolution of the panorama may decrease with the extra overlap.
If you want to shoot with subjects extremely close to the camera, fisheye lenses are the way to go. However, fisheye lenses produce fuzzy edges around the image circle and capture traces of lens flares or bluish light around the image. Autopano sometimes blends these artifacts, as well as the black frame, into the stitch. You may see black or blue blobs in the blending of the sky or ground.
Solution:
Crop it like it's hot.
When stitching footage shot with fisheye lenses, check the Circular Crop tab and set the image properties. After an initial calibration, open the stitch in APG. Look for the tool with an image icon and a small info i. Click it for the popup.
Autopano will then show the frame extracted for the stitching calibration of each camera. Edit the circular crop area, leaving the black and fuzzy blue out. Crop only the crisp and clean image area by decreasing the diameter of the circle. Go through each camera one at a time.
As you can see, every single lens is different, even when they are the same make and model; their centers will be a few pixels off in the frame. Autopano will update the blending and the black will no longer be included in the anti-ghost blending algorithm. The black and blue blobs should now disappear. Use the Masking Markers to fine tune if there are still traces.
First Person
Problem:
You need to stitch a first person POV.
The best first person POVs can be experienced through a few proven programs: combat training for the military, flight simulations for the Air Force, virtual driving simulations for tank drivers and firemen, surgery simulations for medical personnel, etc.
First person experiences are truly powerful and will contribute to the future success of VR. Why recreate reality when you could make your audience dream on demand? Or put them in the middle of an adrenaline inducing heist? Or in surreal landscapes? What if our schools taught history by recreating the past in VR, or science by submerging students into microscopic worlds? Virtual Reality lets you make the impossible possible. The only limit is your imagination... and the treatment of looking down.
Solution:
The Treatment of Down.
Should you have a body in a first person experience? VR makers are all struggling with the concept of looking down in a 360 video. In VR games, this is less problematic, since you can model a body and script interactions with a body and arms in a game engine like Unity or Unreal Engine. In VR, disembodiment is one of the reasons viewers get the motion sickness you may have heard of or personally experienced. If you are considering making a first person experience, what can you possibly display where your viewers look down?
Less is More
As explained in Patching Nadir, replacing your equipment or tripod by clone-stamping patterns to recreate a floor provides a great result, without distracting from the content. For Moving Shots, Shoot the Moon at the exact same speed as the recorded motion. For stereo experiences, shoot your plate in stereo, or the transition from a stereo scene to a mono floor may disturb your viewers, taking the focus away from your content. You can think of the tripod or nadir hole as a limit of technology, or instead as a creative challenge! Think of unusual ways to treat the problem. For example, one way to embody your viewer in another body, without showing anything when they look down, could be to composite a plate of your actor facing a mirror as an intro shot.
Pre-rendered 3D Model
Try compositing a 3D body model on top of your rendered live action 360 video. Some of the best examples are the Insurgent VR experience by Kite and Lightning, or Nike's Neymar 360 soccer spot. This approach seems complex to achieve, but with the use of Andrew Hazelden's Domemaster3D in Maya or Blender's built-in LatLong renderer, any artist can create a custom model or buy a 3D model, render it out to LatLong mono or stereo, and comp it in AE later.
Foreground Stitching.
First, find the frame with the largest number of body parts in the foreground, even if the arms are just resting on a chair. Edit the frame using a foreground approach.
Looking at the seams and the control points already detected, start by cleaning bad points and then remove all points from the background. Once you have removed the points on the background, find new points on the body parts for each pair of cameras.
Click the Optimize button when the clean up of the control points is done. To improve the stitch, as it probably won't be fixed yet, you should adjust the optimization of the lens distortion, since you are stitching a complex parallax problem. The sensor of each camera is not perfectly aligned/centered with its lens, and because of that each camera has its own lens distortion model.
Under the advanced optimization settings, at Scopes, try setting Distortion to Optimize 3rd order and the scope to Image. Then Optimize. The stitching should be much better. Recreate the leftovers from bad blending, frame by frame, with AE Comping or Rotoscoping techniques to attain perfection.
Moving Shots
Problem:
You need to stitch a moving or dolly shot.
Unless you are into hyperrealist films with beautiful ultra long takes, minimal camera movement, and the spontaneous manifestations of daily life, shooting without any camera movement may be boring and limiting. Most 360 videos place the viewer in a static position, and you want to experiment with the fourth dimension: xyz over time.
In VR, a dolly shot with even the slightest camera movement may trigger instant motion sickness and nausea. This occurs when your eyes report motion but your body does not physically move; the mismatch with the vestibular system in your inner ear results in VR sickness.
Until research and development on galvanic vestibular stimulation for virtual reality matures, it is up to game developers, content creators, and filmmakers to make the choices that reduce motion sickness. Whether shooting a moving shot on a dolly or a drone, make sure to engineer proper rigging and equipment for Stabilization.
When stitching your moving shot, the seams are more visible because of the motion.
Solution:
After synchronizing and color matching your cameras, start by setting the blending in your Autopano project to ISO cutting; Smart cutting disables certain analysis tools needed to stitch a moving shot.
First, stabilize the shot. Autopano has a built in motion stabilization tool that can be applied to any moving shot. The motion stabilization tool may take a while to process the entire shot, so select only the in/out range you need based on your First Assembly. Make yourself a cocktail while you wait. Cheers!
Second, fix the horizon. The stabilization will smooth the shaking but will lose the horizon. Correct the horizon after performing the motion analysis. Use the cutting cursor to edit the horizon from your IN point to your OUT point, then apply your horizon changes section by section.
The third step is running the RMS Analysis. Whether you used a head mounted camera rig, a monopod, or an expensive motorized dolly, always use the RMS analysis on any type of moving shot. Click RMS, located to the left of Stitch in the AVP timeline, to start the analysis. Use the cutting cursor to separate the timeline into sections that are similar visually and in terms of RMS. For example, if there is an area where the RMS values are high, cut it into its own section. Now re-stitch and optimize each section by selecting its state. Restart the RMS analysis to see the improvements in the updated values.
Lastly, try Patching Nadir for large dollies that cover the ground floor. Expensive motorized dollies are a great luxury but come with complex compositing work. A plate shot at the exact same speed, lighting, and camera position is needed to cover the dolly. Otherwise, you may have to track and recreate the floor.
Patching Nadir
Problem:
The tripod of the camera rig is visible and masking markers aren't removing it.
You have stitched your videos but the tripod still shows. You didn't Shoot the Moon to replace the camera rig, and Autopano does not recreate missing pixels. When you try using the red Masking Markers to hide the rig, some weird blending or a black hole is generated in the panorama.
Solution:
Use Photoshop's clone stamp to recreate the missing pixels.
Flip the panorama vertically to edit the desired area. Then convert your 2:1 Equirectangular panorama to the 180 degree Fisheye / Domemaster format. After editing the nadir or zenith in Photoshop, convert back to the 2:1 format and flip if needed to flatten the panorama.
First, install Andrew Hazelden's Domemaster Actions for Photoshop. After importing the image sequence into After Effects, create a new composition with it and save the first frame as a Photoshop layer. Render the file, and open it in Photoshop.
First, rotate your image 180 degrees if you are editing the nadir or zenith. Flatten the layer before applying any Domemaster actions.
PROTIP: The stamp tool icon is located in the tool box. Press S on the keyboard to access it, then Alt-click to select the area to clone from. Click on the area with the tripod and apply the clone.
Afterwards, flatten your layers and apply the reverse Domemaster actions, angular fisheye or 180 degree domemaster to 2:1 equirectangular. Then rotate the panorama 180 degrees back if needed.
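If you prefer scripting the reprojection instead of running the Photoshop actions, recent ffmpeg builds (4.3 and later) ship a v360 filter that converts between equirectangular and fisheye. A sketch with placeholder file names, worth verifying against your build:

ffmpeg -i pano_frame.tif -vf "v360=input=equirect:output=fisheye:h_fov=180:v_fov=180" nadir_fisheye.tif
ffmpeg -i nadir_fisheye_edited.tif -vf "v360=input=fisheye:output=equirect:ih_fov=180:iv_fov=180" pano_frame_fixed.tif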
Back in AE, import the Photoshop layer and place it on top of your existing image sequence. The frame with the tripod removed will be used for the entire image sequence. Change the dimensions of the Photoshop layer to 50% smaller if needed.
Using the pen tool, hide the layer and create a tight mask around the tripod on the original sequence. Show the hidden layer you created the mask with and adjust the feather to blend the edges of your mask with the original image.
Stitch Anything
Problem:
You need to stitch together footage with each angle shot at a different time.
You shot different angles, or different content within the same angle, using one GoPro, DSLR or RED Dragon camera. You can't synchronize using audio or motion.
Solution:
Assembling ready-to-stitch camera footage.
Shooting with this method is usually done for creative, quality, or financial reasons. Maybe you want to create an interesting or funny scene with the same subject in every angle performing different actions. You might want to composite super crisp, high resolution, cinema quality shots using one RED camera, or you just want to experiment with new ways to stitch and composite.
More time will be spent on the edit, deciding what should be in each angle of the experience, visually as well as sonically. To synchronize takes that were shot on different occasions, edit your first assembly with one video track per angle.
One option is to defish your fisheye footage one angle at a time (see Stitch to Defish) and blend the angles in AE. You can also create overlapping areas over your source footage in AE, render each camera, and utilize Autopano's blending algorithm.
In this case, there are 4 angles with the same subject in each angle doing different things. Edit each angle in Premiere just as in the First Assembly chapter. Trim and place your clips in their respective video track timelines. Delete all empty or effect spaces between clips and render each video track for the entire assembly. When rendering your files, add the camera number for the corresponding angle. Each video should account for an angle of the field of view, with its background overlapping the other cameras. The key is to stitch the background, since the actions happen fully within each camera.
Stitch to Defish
Problem:
You want the fastest option to preview fisheye footage in a VR headset.
Just like the Dailies Quickstitch, you need to preview every piece of fisheye footage you recorded. Even with Optics Compensation in AE, the scaling may be off or the distortion is not handled correctly.
Solution:
Way to go, Autopano!
Autopano will not only recalculate and adjust your lens distortion, but will also map it with the right scaling into a 360 x 180 LatLong projection that can be rendered for an instant preview in a VR headset.
There's a catch though. Autopano can't import only one video: it is stitching software and needs at least two videos to stitch together. Export a black video from After Effects with the same configuration as your video (same FPS, size, and length) to defish. Another method is to duplicate your video and import both together.
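You can also generate the black dummy straight from the command line. A sketch, where the size, frame rate, and duration are placeholders you should match to your clip:

ffmpeg -f lavfi -i color=c=black:s=1920x1440:r=29.97 -t 10 -pix_fmt yuv420p black.mp4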
Autopano will map any video you import onto spherical geometry. Import your black and original videos and select the focal length and distortion before stitching. Stitch and click Edit to open APG. Inside APG, uncheck the layer with the black video or the duplicate of your video. AVP will usually blend your video with the black video, causing the overall lighting to get darker, so under the Preview tab, set all blending and weighting to None, then Update. Lastly, launch the Image properties window from the Layers panel if your video looks too scaled down, and use the circular crop tab to adjust the area. This should help scale your video up to the 360 space.
Good news! If you are unfamiliar with 3D software like Maya or Blender, you can create a 3D title in AE and render it out as a lossless .mov along with a duplicate or black video. Run your title through Autopano the same way to get the right warping for viewing it in a VR headset. Render a tiff sequence of the title from AVP and bring it into your AE assembly without further warping.
Noisy Footage
Problem:
You shot a night scene and your GoPros are too noisy.
Shooting with GoPros, even when using Protune, cannot compare to the quality of a DSLR. Adjusting the ISO, Low Light and EV Comp options will not guarantee you better nighttime shots. The footage will be too dark and noisy, and it could be unstitchable and distracting in a headset. For optimal conditions for shooting with GoPros, check the Lighting chapter.
Solutions:
Denoise with Neat Video.
Install the Neat Video plugin for AE or Premiere on PC or Mac. The Home version will only denoise footage up to a resolution of 1920x1080, so you will need the Pro version.
Neat Video does a great job at denoising, and applying it to your individual cameras before stitching will help find more control points, and in some cases will make stitching possible at all. Below is an example of stitching 3 images by auto-detecting the points and geometries right from APG. Without Neat Video, noisy footage can actually be a problem for APG.
Import your individual cameras into AE and create new compositions from the files. Find the "Reduce Noise" plugin in the AE effects and apply it to each of your clip layers. Make sure the AE preview is set to Full resolution and disable Fast Preview for optimal performance. Click "Prepare" in the effect controls of the Reduce Noise effect. Register your license of Neat if you haven't already. Auto-create a noise profile by selecting Auto Profile, then refine it by moving the selection box over an area that contains mostly noise and little color variation, for example the noisy sky or ground of your video.
The noise profile captured from the selected area will be applied to the whole frame and throughout the duration of your individual clips. Use the temporal filter to increase the number of frames analyzed before and after the selected frame; Neat will then better calculate the amount of reduction for upcoming frames. Use the spatial filter sparingly.
Render your videos separately as .mov files or as a tiff sequence that you can later convert to mp4.
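A sketch of that tiff-to-mp4 conversion with ffmpeg, assuming zero-padded frame numbers and placeholder names:

ffmpeg -framerate 29.97 -i denoised_%05d.tif -c:v libx264 -crf 18 -pix_fmt yuv420p denoised.mp4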
Comparing noisy footage before and after Neat Video in AVP, here are the results: AVP found more control points with the default auto-detection, and the noise has been greatly reduced.
edit
First Assembly
23. Keep track of every day the date emblazoned in yr morning
- Jack Kerouac, Belief and Technique for Modern Prose
Problem:
You need to assemble a rough cut with multiple unstitched video streams.
You have just rendered the quickstitches with burnt in timecode and have to select the best parts for your edit. Should you edit with the source or the stitched footage? How should you log notes for the best 360 edit?
Solution:
Log notes from reviewing quickstitches.
Whether viewing the dailies with the crew after each day of production or during the director-editor viewing session in a headset, always log notes with the 360 space in mind. When auditioning the best material, consider which camera the viewer will be facing when putting the headset on. Have your log sheet ready with one row per camera.
The log sheet will evolve over the entire 360 editing workflow, so make it clean and beautiful! During ingestion, have the DIT start this sheet by adding a column for each camera, a row for each take, and some notes such as bad cam, false take, dropped cam, etc. After organizing your camera files into take folders, update this log sheet and, below each take, add as many rows as there are cameras.
The goal of the log sheet is to track the INs and OUTs of all your selects, the cameras that need exposure correction, the synchronization offsets, the location of files, and all other notes from the team. The log sheet will be extremely helpful for the stitcher, editor, and director.
Set the timecode of your quickstitch timeline to reflect the same timecode as the source footage you will be editing later.
When assembling all the clips in your timeline, focus on the timing of the transitions. Give the viewer enough time to adjust to the new scene. Then edit all your best clips in the order you desire. When satisfied with the first assembly of the quickstitches, render a low resolution preview of it or start the next phase: assembly with the source footage.
Assembly with the source footage requires one video track per camera and should precisely match the rough cut edit of the quickstitches. Make sure the quickstitches are properly named with the take and camera number. This will make it easy to locate the cameras that correspond to each clip in the timeline. Select all the cameras of each take and sync them using the multi camera Synchronization through audio.
Bring the synced sequences of source footage into a new timeline with settings matching the camera settings. Trim based on the IN and OUT points of your log sheet and assemble them like the previously stitched first assembly. It's crucial to keep the same settings as the source video to avoid any compression.
If you shot plates or created titles and other VFX, you can easily add a video track over the source video track to create the final result you are after.
The assembly using the source footage is not for preview purposes, but for exporting the EDL or XML file. The EDL file, or Edit Decision List, is a file that many editing softwares read in order to recreate the same exact timeline after relocating the project folder and files. If your individual cameras require Color Matching, or if the content of your individual cameras was not shot at the same time, exporting the EDL from the source footage will help you more than exporting the EDL from the quickstitch assembly.
The purpose of the Dailies Quickstitch is to review and approve the shots for the first cut. After the shots are approved, the focus can turn towards fine stitching the selects. This minimizes render time and saves time, as the stitcher fine stitches only the takes needed. The most optimal workflow for the stitching post production pipeline is to quickstitch all the footage, choose selects, then fine stitch the selects.
PROTIP: If the content of your individual cameras was shot at different times but you are editing quadrants shot from the same camera position and location, you can render the video tracks separately and stitch them all in one take.
Mise en Place
Problem:
You need to set up your After Effects project after rendering uncompressed 16 bit tiff frames from AVP.
After rendering your stitched panorama(s), there is still work to be done! You have to hide the tripod, add transitions, effects, color, and titles. Should you work in Premiere or After Effects? Which shortcuts and pre-comp settings are the most optimal?
Solutions:
After Effects 16 bit Project.
Every step along the pipeline will process and distort your colors, decreasing the potential for the highest quality picture. Your eyes may not be able to see it, so look at the changes in the unique number of colors and in the RGB histograms via GIMP. They will be drastically affected depending on your setup and plugins for effects and color grading.
Working with 16 bit tiffs in an 8 bit AE project will cut the color information in half at render time, increasing the risk of introducing banding. Banding is more visible in a VR headset than in any other medium, so unless you have a good reason not to, set your AE project to 16 bits per channel.
Import the tiff sequences rendered from Autopano and rename them with the scene and take of the sequence. Create folders matching the names of your scenes and place the stitched tiffs into the corresponding folders. Keep the digital workspace clean and comprehensible, implementing a system with your team. Organize every project with folders separating the source files from the output files. Add a working folder between any step that includes the work from a specific software. Separate renders from the work folders into the main Render folder.
Organize your AE or Premiere project with scene or take folders that include the stitched mp4 (not final, as it is compressed) and the tiff sequence (uncompressed).
After you mask the tripod using the Patching Nadir method in Photoshop, import the PSD file into AE. This allows you to edit in Photoshop and have the updates carry back into AE. The same applies to Rotoscoping with Mocha and editing audio with Audition. Keep all your work organized in the same take folder in AE.
Let's precomp the layer and name it Plate. In this precomp, add your Photoshop layer to patch the nadir, along with any work to fix seams.
Select the Plate precomp in the _Main comp, use Command + D to duplicate it, then press P to reveal Position and change the X value of the precomp on top. Add the width of your composition to the X value (e.g. 1920 X value + 3840 comp width). Select both precomps together and move them horizontally to change the center of the viewpoint.
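The same recentering can be applied to an already rendered equirectangular file with ffmpeg's v360 filter (4.3 and later), rotating the yaw instead of sliding layers. A sketch with placeholder values:

ffmpeg -i pano.mp4 -vf "v360=input=equirect:output=equirect:yaw=90" pano_recentered.mp4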
Rotoscoping
Problem:
Your moving subject or object has complex seams and you've tried everything to fix it in Autopano.
Sometimes Control Points and Masking Markers aren't enough, and you need an alternative to deliver a perfectly stitched panorama. If you are not familiar with Mocha and rotoscoping, now is the time! Rotoscoping is a technique where a subject, either live or animated, is traced over, frame by frame, to create a matte so it may be composited over a different background or environment. Good news: all versions of AE come with a free version of Mocha, the tool for rotoscoping!
Solution:
Track First. Roto Second.
The process of rotoscoping over a stitched panorama starts with rendering a few different panoramas from Autopano. First, render your base panorama, the one with the best possible blending and the least amount of seams in the background, as a 16 bit tiff sequence.
Then render only one camera using the same stitch template but without any blending. In APG, uncheck all layers/cameras, turn blending off, and check the one with the subject/object to roto. Render it at the same size, as an mp4 and as a tiff sequence.
In AE, import both sequences under their take folder and create a new composition from the base tiff sequence, using the same FPS and dimensions. In the same composition, add the mp4 that contains only the object to roto on top of the base layer. Under the Animation menu, choose Track in Mocha AE. Save your new project to the take folder location to easily access it later.
Before tracking your object, ensure you are working on the least amount of frames to speed up the work. Choose the start and end frame of the roto area that will be composited to mask the seam on your base panorama. In Mocha, go to the starting frame and click the Set In Point button located to the left of the timeline controls. Do the same for the Out point with your end frame. Select Zoom timeline to In/Out points under the Movie menu or from the same controls.
Now that you are ready to track motion with Mocha, press Command + L to use the X-Spline tool or Command + B for the Bezier tool. Go to the end frame of the work area and select a few points around the outline of the object you want to roto. There's no need for a detailed shape yet. In the lower part of Mocha, locate the Motion parameters and select only the tracking data for Translation, Scale and Rotation. Press Shift + < to track backwards. Adjust the X-Spline points on the starting frame, then track forward.
After the tracking is done, rename this roto layer under the Layer controls (left area) to track and hide it by unchecking the eye icon. Press Command + L or B to start a new roto, which will be the detailed roto mask. Take your time selecting the points and try rounding the shape with the blue splines. When satisfied with the mask, uncheck the process icon next to the eye icon in the Layer controls for the layer you just created. Then find the Link to Track dropdown under Layer Properties and select the previous layer, renamed track. You have just linked your detailed shape to the tracked motion of your previous layer. Use the playback controls to check and verify the motion.
This is the bare minimum to understand how tracking and rotoscoping with Mocha work. When rotoscoping a moving subject with multiple movements happening simultaneously, such as head rotation, arms bouncing around, or legs walking, consider each movement separately. Following the same steps, create a quick mask with X-Splines, track backward, adjust your points, track forward, and then create your detailed mask to later link to the tracked layer. The more points, the more time Mocha will take to track.
Back in AE, select your footage layer and place the cursor at the starting frame, or press I to jump to the layer's In point. Then, under the Edit menu, select Paste Mocha Shape. AE will create a custom mask based on the tracked Mocha shape.
Go to the frame that has the first keyframe of the tracked roto and press M to show the Mask options. Each spline can also be readjusted frame by frame in AE. After rotoscoping and compositing the object onto your base panorama, follow the instructions in AE Comping to blend this roto.
AE Comping
Problem:
You need to comp two stitched videos warped differently in After Effects.
Compositing in After Effects is its own art form, as many tools can be used to achieve similar results. The goal is to perfectly blend one piece of footage over another. AE is the alternative solution when stitching in Autopano is not enough to correct a few seams, or when some external footage needs to be integrated into your panorama.
Solution:
Using Masks.
Using the pen tool to create a mask over a static shot is the fastest way to composite an object into your panorama. For moving shots, first motion track using a null object and link your mask to it, or use Rotoscoping with Mocha. Any green screen footage will benefit from Chroma Keying techniques, which are much faster than rotoscoping, though the results tend to be a bit rougher.
Try to create a large mask with points following the architecture of the distant background that contains the object or subject, to hide seams left unfixed in AVP. Sometimes that alone is enough. For moving shots, motion track first.
When satisfied with your mask and its movement over time (see Rotoscoping for fast tracking), compare the RGB histograms between the footage being comped and the base footage. Levels is the best plugin to correct exposure in After Effects and has the RGB histogram you need. Sometimes it is enough to bring the mids up or down to lighten or darken the shot being comped. Keep the edge of your mask sharp to better gauge how much gamma adjustment is needed for a nice blending of the colors.
Chroma Keying
Problem:
You need to comp a green screen plate into a stitched equirectangular video.
Creating VFX that are warped correctly in 360 space can be time intensive work. You may get away without changing your projection or using special warping plugins, but the result will not be the best.
Solution:
From Autopano to Keylight.
Basic green screen footage needs to be comped into your 360 panorama.
The footage can be filmed with any camera, like a RED, an Arri, or a DSLR,
or with the same 360 rig. Shooting with the same 360 rig makes the job a
lot easier. With any green screen footage, use AVP to unwarp and reproject
it onto a sphere.
Import your footage and click Edit to open APG. Uncheck the unnecessary
video layers to render only the green screen with the right warping for a
LatLong 360x180. If shot with a fisheye lens, use the Circular Crop Factor
to scale your video up. You can also set the yaw and pitch of your green
screen layer to 0 to center it, and change the FOV value, which should
help scale your footage in the 360x180 projection. Render this as a 16-bit
uncompressed tiff sequence.
Import into AE and add the Keylight 1.2 plugin to the green screen tiff
sequence. If your green screen was handled perfectly during production,
use the Screen Colour selector to key out the green in your video.
Otherwise, play around with the Screen Matte parameters to adjust the
green keying. You can also increase the greens of your screen by using
the Selective Color effect.
If your keyed object was not properly warped, it's safe to place it on the
center horizon of your panorama. Scale it down to fit between the first
row up and down on the proportional grid to minimize the potential for
distortion.
To animate any 2D, 3D, or green screen elements, head over to your plate
composition, where you can add the layers you want to animate. For example,
this eagle has been keyed out and placed on top of the base panorama. The
warping is incorrect when previewing it via the Preview comp.
You need to convert the warping by applying the Mettle SkyBox Converter,
located in your AE effects. Drag the effect onto any layer and SkyBox
will convert and unwarp it the right way for you to start animating.
Once all elements have been converted, you can start keyframing and
animating just like you would in any other AE project. Use the Preview
comp to check the warping and adjust your animations as necessary.
You can also use SkyBox to convert text elements and create title
animations for a LatLong 360x180 projected output.
Color Grading
Colour is the keyboard, the eyes are the hammers, the soul is the piano
with many strings. The artist is the hand which plays, touching one key or
another, to cause vibrations in the soul.
- Wassily Kandinsky, Concerning the Spiritual in Art
Problem:
After coloring the blacks, you get more banding.
After sharpening, a cable appears at 180 degrees.
Color grading a 360x180 LatLong is not like coloring any other flat video.
Coloring the lows, mids, and highs can introduce unwanted banding. How
do you color grade, sharpen, and blur as you normally would?
Solution:
The ultimate sophistication. DaVinci Resolve.
Resolve is a color grading platform first and foremost. Designed to let
the colorist quickly correct hundreds of shots while keeping track of all
the grades, Resolve is a great solution for color grading after fine
stitching your footage, and also for Color Matching your individual cameras
before stitching. The RGB histogram is much larger than in AE or Premiere,
which eases the process of color matching cameras.
You've stitched and edited all your shots nicely with AVP, AE, and Premiere.
Now it is time to color grade using the industry standard, Resolve! When
working across several software packages, bringing footage into one and then
another, also known as "roundtripping", you need to be well organized.
Keep your source footage in the source folder and separate all your work
from different programs into folders named specifically by software. For
example, you would set up an Autopano folder where all of your stitching work
happens, a Premiere folder for your edits and XML files, a Resolve folder for
color grading, and maybe even a Maya folder if you plan to add 3D assets
to your project. Most of the work will be data that shouldn't be moved
around, and rendering should always target a folder named "Output".
To quickly color grade your GoPros before and/or after stitching, you will
want to use a LUT, or Look Up Table. A LUT is a color transform: a set of
numbers that changes the colors of an image. LUTs are commonly used to
correct log footage.
Since you shot with Protune on, your individual cameras or stitched
panoramas will be flat. First, to convert the log footage to Rec.709 space,
select Nodes and right-click on your clip to add a 3D LUT.
When coloring 360 videos, some LUTs may correct the curves too much, which
means more banding can be introduced. While banding is a normal problem
when grading flat videos, in 360 videos it is an even bigger one. A/B test
your grades before deciding on the final color grade. In a VR headset,
banding and colored shadows or blacks are very noticeable and distract
from the reality of the experience. Careful mise en place is necessary to
avoid banding; working in a 16-bit color mode is considered sufficient to
render frames in Rec.709 or sRGB.
Apply the first 3D LUT to convert from log to Rec.709, and then add a new
node (Alt + S) to color grade further or apply a second LUT to emulate a
film look. When the offset is adjusted too much and the blacks are colored
on the second pass, you can test different transparencies of the LUT look
by adjusting the intensity under the Lumetri Color plugin in Premiere or
the opacity of your LUT adjustment layer in AE.
Since the left and right edges of your video have to match perfectly in
order to create a seamless 360 video, you will need to sharpen only certain
areas of the equirectangular video.
To keep the visible transparent cable from appearing at 180 degrees from
the center of the video, use masks and feather the edges of the masked
areas. In DaVinci, use the rectangle window, soften the edges of the
rectangle area, and then sharpen inside the area.
render
A/B Testing
Problem:
You need to compare renders and decide
what improvements are required.
You can render multiple versions of the stitched video in AVP by switching
.pano templates or adjusting certain settings, such as ISO or Smart
blending. Before rendering your final high-quality uncompressed tiff
sequence, do an A/B test of the different render choices by viewing them
in a headset. Seams and errors are more apparent when viewing through a
headset. How do you upload, play back, and compare versions?
Solutions:
Playback on the desktop using Kolor Eyes.
The easiest way to quickly preview your stitch is to drop it into the Kolor
Eyes player for desktop. Open the application and drag your stitched mp4 or
mov into the Kolor Eyes window. Use your mouse cursor to rotate around
the panorama, checking for seams and any areas that require cleanup.
Next, check your Display preferences and make sure to rotate the Rift
display by 90 degrees.
Back in Kolor Eyes, you should see the eye or Oculus icon at the bottom of
the window. Click on it and check in the Rift to see if your video is showing.
The Rift's head-tracking sensor should now be controlling the orientation in
Kolor Eyes. Put the headset on and check for seams and areas that require
additional attention.
On the Gear VR, launch the Oculus app under Apps and start the 360 Videos
app, then put the headset on to review. You can also access the 360 Videos
app from the Oculus Home menu.
Tap the touchpad of your Gear VR while pointing the gaze cursor at your
video, then check for the seams that require your attention.
Hello FFmpeg
Problem:
You need a tool for compression.
Whether you are just starting the ingest process and need to concatenate
multiple sequences, want to combine left and right eyes into a well-scaled
over-under, or want to render multiple compression tests, FFmpeg is the
perfect tool for it all. At any step of the 360 workflow, understanding
FFmpeg will come in handy.
Solutions:
Hello, FFmpeg!
FFmpeg is the leading multimedia framework, able to decode, encode,
transcode, mux, demux, stream, filter and play pretty much anything that
humans and machines have created. It supports the most obscure ancient
formats up to the cutting edge. No matter if they were designed by some
standards committee, the community or a corporation. - FFmpeg
On a Mac, after downloading the FFmpeg binary, open the Terminal using
Spotlight search. In the terminal, you will first need to show all
the hidden files and folders on your Mac. Enter this line and press return:
defaults write com.apple.finder AppleShowAllFiles YES
Use the Force Quit menu under the Apple icon to relaunch the Finder. The
hidden files should now appear in your Finder window.
Next, place the unzipped FFmpeg binary, which should be a roughly 30 MB
file, into the shared user folder containing all your other binaries. Using
the Finder, go to the Computer folder (Macintosh HD, unless it was renamed)
with Command + Shift + C. The FFmpeg binary belongs at /usr/local/bin; if
the folder doesn't exist, create it. Authenticate by entering your password.
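If you are comfortable in the terminal, the same placement can be done with
a few commands. A minimal sketch, assuming the unzipped binary sits in your
Downloads folder:
sudo mkdir -p /usr/local/bin
sudo cp ~/Downloads/ffmpeg /usr/local/bin/
sudo chmod +x /usr/local/bin/ffmpeg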
Now, back in your terminal window, use the up arrow key to bring back the
line that showed all hidden folders, replace YES with NO to hide them
again, and relaunch the Finder. In the terminal window, type ffmpeg and you
should now see the correct version of the binary installed.
On Windows, after unzipping FFmpeg into C:\Program Files\ffmpeg, add the
following text to the end of the Path environment variable and then click
the OK button to accept the changes:
; C:\Program Files\ffmpeg\bin
Since the FFmpeg bin folder has been added to the system's Environment
PATH variable, you will now be able to run FFmpeg just by typing "ffmpeg"
or "ffmpeg.exe" in a new Command Prompt window.
Convert Files
FFmpeg is the best tool to quickly convert video and audio files to almost
any format. For example, type this line in the terminal to convert a .mov
file into an .mp4 file:
ffmpeg -i video.mov video.mp4
Now let's convert a tiff sequence into an .mp4 file. First, replace the
sequential numbering of your tiff filenames with %05d, which stands in for
the five digits (00000). An image sequence also requires a frame rate, set
with the -r option, to be converted into a movie file. Try this on a tiff
sequence:
ffmpeg -i sequencename_%05d.tiff -r 25 sequence.mp4
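The reverse direction works the same way; to dump a movie back out to a
tiff sequence (a quick sketch, with placeholder names):
ffmpeg -i sequence.mp4 frame_%05d.tiff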
Concatenate Sequences
When recording video with the GoPro HERO4 Black, notice that the files are
cut off anywhere past 3.8 GB. Depending on the video mode, this could be
8 to 11 minutes, so the take will be split into chunks. This is because the
microSD cards are formatted FAT32, which limits file sizes to 4 GB. If you
are using a 64 GB card, it will be formatted exFAT, which normally allows
larger file sizes; however, the GoPros will still cap the file size at 4 GB.
You can shoot a take as long as you want, until the cards fill up or the
batteries die. The individual segments will need to be concatenated into
one file before stitching.
Use FFmpeg to concatenate your sequences. It will be much faster than using
a video software's rendering engine, such as importing into AE and exporting.
Create a file mylist.txt listing all the files you want concatenated, in
the following format.
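The concat demuxer expects one file directive per line; the segment names
here are just placeholders for your own chunks:
file 'GOPR0001.mp4'
file 'GP010001.mp4'
file 'GP020001.mp4'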
Note that these can be either relative or absolute paths. Then you can
stream copy or re-encode your files:
ffmpeg -f concat -i mylist.txt -c copy output.mp4
When you need to add multiple audio tracks to a video, for example for
later use with head tracking, you will need to understand the -map option.
ffmpeg -i video.mp4 -i audio1.mp3 -i audio2.mp3 -map 0:v -map 1:a -map 2:a -codec copy output.mp4
0:v The 0 refers to the first input, which is video.mp4. The v means "select
video stream type".
0:a:0 The 0 refers to the first input, which is video.mp4. The a means
"select audio stream type". The last 0 refers to the first audio stream from
this input. If only 0:a is used, all audio streams from this input are mapped.
1:a The 1 refers to the second input, which is audio1.mp3. The a means
"select audio stream type".
-codec copy will stream copy (re-mux) instead of re-encoding. If you need a
specific audio codec, specify -c:v copy (to keep the video) and then, for
example, -c:a libmp3lame to re-encode the audio stream to MP3.
-shortest will end the output when the shortest input ends.
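Putting those last options together, a hypothetical single-audio-track
version that keeps the video, re-encodes the audio to MP3, and stops at the
shorter input would look like:
ffmpeg -i video.mp4 -i audio1.mp3 -map 0:v -map 1:a -c:v copy -c:a libmp3lame -shortest output.mp4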
H.264 vs H.265
Apples current preferred compression format, H.264, has been a huge success as being the most exible codec widely used for streaming videos,
capable of handling stereo 3D videos, 48-60 FPS and even 4K resolution.
The problem with H.264 however, is that while it can handle these types
of encodes, it cant do so while simultaneously keeping le sizes low. A
new standard is necessary to push le/stream sizes back down while
driving next-generation adoption, and thats where H.265 comes in. Its designed to utilize substantially less bandwidth thanks to advanced encoding
techniques and a more sophisticated encode/decode model.
In order to obtain a copy of FFmpeg with libx265 support, you need to build
it yourself, adding the --enable-libx265 configuration flag, with x265
installed on your system (see the build sketch after the examples below).
Here's how you would convert and compress a tiff sequence with each codec:
H.264:
ffmpeg -r 29.97 -i sequence_%05d.tiff -i audio.mp3 -c:v libx264 -preset fast -maxrate 20000k -bufsize 20000k -vf scale=3840:1920 -pix_fmt yuv420p -crf 18 output.mp4
For H.265, options are passed to x265 with the -x265-params argument,
such as:
ffmpeg -r 29.97 -i sequence_%05d.tiff -i audio.mp3 -c:v libx265 -preset fast -maxrate 20000k -bufsize 20000k -vf scale=3840:1920 -pix_fmt yuv420p -x265-params crf=18 output.mp4
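As for the build itself, a minimal configure sketch, assuming the FFmpeg
sources are downloaded and x265 is already installed (x265 is GPL, so
--enable-gpl is required as well):
./configure --enable-gpl --enable-libx265
make && sudo make install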
Almost Done!
Problem:
You are testing playback of the final delivery, and it
is not playing or is too large for the device.
There are many platforms and devices your experience can be distributed
on - Oculus Rift, Samsung Gear VR, Google Cardboard, etc. Decide which
platform(s) you want to release on so you can output different formats at
the optimal settings for each. Compression settings depend on the exact
device, so perform multiple compression tests to gauge the best settings
for each one. If you want to release on Android, there will be many
different devices to test. Google Cardboard is the cheapest way to try VR,
and you will most likely want to render a version for it as well.
Solutions:
Know your hard(wear).
The Oculus Rift headset caters more toward gamers, and most consumers will
be less hardcore. The most accessible way to watch a 360 video experience,
then, is with a smartphone, which everyone already has in their pocket. A
headset like the Samsung Gear VR still needs to be purchased; viewers then
mount their Note 4 or Galaxy S6 phone in the headset and use it as a
display. Another option is to build a viewfinder out of cardboard! Google
Cardboard can turn almost any phone, including the iPhone and other Android
devices, into a viewfinder.
Most phones cannot handle video files over 500 megabytes. Keep your video
at the highest quality that neither overheats a phone nor takes days to
download.
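As a rough worked example (the numbers are illustrative, not a
recommendation): a 7-minute video runs 420 seconds, and a 500 MB file holds
about 500 x 8 = 4000 megabits, so video and audio together must stay under
roughly 4000 / 420 ≈ 9.5 Mbit/s. In FFmpeg terms, that might mean capping
the video stream with something like -b:v 9000k -maxrate 9000k -bufsize 9000k.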
Currently, users download every experience onto their phone's internal and
external storage. For those who love taking photos and videos, there may
not be enough space to store the 360 experiences on the same device. Find
a way to deliver the experience at a reasonable file size without
completely degrading the quality.
Check playback of every file on every device you will be releasing the
experience on, and make sure to watch the video all the way through. For
example, if you are testing a 7-minute video, it might play back smoothly
at the beginning, yet the phone cannot handle playback 3 minutes in. There
is no way to catch this unless you watch the video from start to finish.
Do multiple solid tests, both for the sake of the time and effort spent on
the production and for the viewer, since bad playback causes choppy video,
which may induce nausea.
PROTIP: Your render may be jittery or fail to play back on the Gear VR or
Google Cardboard if the resolution exceeds 4096x2048. The Gear VR also
currently cannot handle more than 30 FPS.
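A quick way to check a render against these limits is ffprobe, which ships
with FFmpeg; this sketch prints the width, height, and frame rate of the
first video stream:
ffprobe -v error -select_streams v:0 -show_entries stream=width,height,r_frame_rate -of csv=p=0 output.mp4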
Bitrate Analysis.
There are many ways to minimize the size of the final file while keeping
the compression quality high.
The software from Winhoros.de analyzes H.264-encoded mp4s. This tool is a
free bitrate viewer for PC users only; Mac users can potentially use the
Codecian software. Choose your file and let the analyzer run over the
length of the video, frame by frame. After the run-through, the analyzer
will show the average bitrate of the video and a graph of bitrate over time.
Use this tool to find which sections of your final file exceed the average
bitrate. The file exceeds the average bitrate where frames carry an
above-average amount of detail, resulting in a larger file size. To reduce
your file size while keeping overall quality high, compress only the ranges
of frames that exceed the average bitrate. You can cut your file size in
half by recompressing even just 100 frames in a 10,000-frame sequence.
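If you just need the overall average bitrate without a GUI, ffprobe can
report it directly; this sketch prints the container's average bitrate in
bits per second:
ffprobe -v error -show_entries format=bit_rate -of default=noprint_wrappers=1:nokey=1 output.mp4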
With the bitrate viewer data, you can easily re-encode your final tiff
sequence in sections. For example, using FFmpeg, compress the sequences of
frames that sit around the average bitrate with a lower -crf and a -maxrate
capped at the average bitrate. For the sequences of frames exceeding the
average bitrate in the analysis, compress them with a higher -crf to lower
the quality, keeping -maxrate capped at the same average bitrate.
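A sketch of what this might look like, assuming an average bitrate of
18000k and a hypothetical high-bitrate stretch starting at frame 5000;
-start_number picks the first tiff of each section and -vframes limits how
many frames are encoded:
ffmpeg -r 29.97 -start_number 0 -i sequence_%05d.tiff -vframes 5000 -c:v libx264 -crf 18 -maxrate 18000k -bufsize 18000k -pix_fmt yuv420p part1.mp4
ffmpeg -r 29.97 -start_number 5000 -i sequence_%05d.tiff -vframes 1000 -c:v libx264 -crf 23 -maxrate 18000k -bufsize 18000k -pix_fmt yuv420p part2.mp4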
The result will be multiple mp4s, each compressed with the best settings
for its content. Now all that is left is to concatenate the files, as shown
in Concatenate Sequences. Analyze the bitrate of the final mp4 to confirm
the average bitrate remains the same.