LightWave™ 11
Manual Addendum
LightWave™ 11 Features
Contents
Introduction
    Welcome to LightWave 11
Unified Sampling
    About Unified Sampling
    Overview of Controls
    Additional Notes
Instancing
    About Instancing
    Overview of Controls
    Example
    Surfacing Instances
Flocking
    About Flocking
    Using Flocking
    Setting up a Simple Flock
    Overview of Controls
    Complex Example
Bullet Dynamics
    About Bullet Dynamics
    Example
    Overview of Controls
    Bullet Cache
    Bullet and Hierarchies
Fracture
    About Fracture
    Overview of Controls
    Fracturing Algorithms
    Fracture in Layout
Interchange Tools
    GoZ™
FiberFX Enhancements
    About FiberFX Enhancements
Workflow Enhancements
    Introduction
    Align to Plane
    Layout Modeler Tools
    Connected to Lyrs
    Surface Editor Enhancements
    Node Editor Procedural Texture
    Searching Nodes
    Adding Multiple Nodes
    Editing Nodes When Zoomed
    Carpaint Improvements
    Fresnel Node
    NGon Light
    Render Globals Tab Grouped
    VPR Enhancements
    VPR Surface Picking
    Limited Region Enhancements
    Maximum Render Resolution
    Low-Discrepancy Sampling Pattern
    Print Camera
    Render Buffer Enhancements
    Render Time in Image Viewer
    Auto Key Enhancement
    Clone Instance
    Faster Access to the Morph Mixer Interface
Python Scripting
    About Python Scripting
    Python Resources
    Writing a Python plug-in for LightWave 11
Information
    Updates
    LightWave Community
    Copyright and Trademarks
Introduction
Welcome to LightWave 11
We are excited to present this new version with native instancing, flocking, Bullet dynamics,
fracturing and more to add to LightWave's powerful arsenal of proven technology, together with
improvements to pipeline integration, workflow and existing features.
This addendum manual presents in-depth explanations of how to use the new additions to
LightWave 11. For other areas outside of this addendum, please refer to the LightWave 10
documentation.
LightWave 11 builds on what LightWave has always been good at: offering the best all-round
tool for 3D artists. This release also solidifies LightWave's ability to fit neatly into pipelines, and we
believe the new additions will bring real benefit to artists working in production and freelance
roles alike. We’re proud of this release and hope you enjoy it!
LightWave 11 - VFX and Animation with a Proven, Award-Winning Track Record
Unified Sampling
About Unified Sampling
The way that LightWave handles shading, lighting and antialiasing quality has been overhauled
in LightWave 11. Previously, these settings were scattered around the user interface, making
adjustments tedious. More importantly, these settings didn't work together as well as
they could to produce the best results.
In LightWave 11 these settings now work in tandem with each other and the render engine, and
the render engine now only takes samples where it needs to. This often results in improved quality
at equal or improved render times. It is also now much easier to control the maximum number of
samples LightWave will use. Previously this was not possible and could result in overly long render
times.
A number of changes to the interface have been made as a result of this feature. Here is an
overview of the differences between LightWave 11 and previous versions:
Nodes
Material and Shader nodes no longer have a Samples setting. These settings now reside in the
Render Globals > Render tab under Shading Samples.
Surface Editor
In the Surface Editor > Environment tab, Reflection/Refraction Blurring no longer has a Samples
setting. These settings now reside in the Render Globals > Render tab under Shading Samples.
Light Properties
Lights no longer have Quality or Samples settings. These settings now reside in the Render Globals
> Render tab under Lighting Samples.
Antialiasing Settings
On the Camera Properties panel, you’ll now see a Minimum Samples and Maximum Samples setting.
These replace the previous antialiasing settings, except when using the Classic Camera.
Render Options
Two new settings, Shading Samples and Lighting Samples, are now present on this panel. This is
where the global (or unified) settings for all shading and lighting quality are located.
Overview of Controls
Shading Samples
This affects all nodes that previously had a Samples setting (such as blurry reflections/refraction)
and replaces them with a single, global value.
To shade a pixel, LightWave needs to gather information about it. Shading Samples defines how
many rays are fired for that pixel to determine its overall shading. Every pixel in the rendered image
is hit by an initial ray fired from the camera; when a hit occurs and Shading Samples is higher
than 1, more rays are fired - the Shading Samples value sets how many, so a setting of 4 fires four
rays. This process is repeated for each pixel as many times as the Minimum Samples setting in
Camera Properties dictates.
Lower numbers render faster at the expense of quality (noise in the image); the reverse is true for
higher numbers.
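As a rough, back-of-envelope illustration (a hypothetical sketch for intuition only, not LightWave's renderer code, and the real interaction is more sophisticated), the ray count per pixel grows multiplicatively with these two settings:

    # Hypothetical sketch of the relationship described above.
    def rays_per_pixel(minimum_samples, shading_samples):
        primary = minimum_samples                      # camera rays, one per pass
        secondary = minimum_samples * shading_samples  # shading rays fired per hit
        return primary + secondary

    print(rays_per_pixel(minimum_samples=4, shading_samples=4))  # 20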
Light Samples
This works in a similar way to Shading Samples, but for light (and shadow) quality. The lower the
number, the faster rendering will be, again at the expense of noisier shadows. Higher numbers
produce smoother soft shadows, but result in higher render times. This setting now applies to all lights.
Tip
In previous versions of LightWave this setting was squared, so the default light
quality setting of 4 prior to LightWave 11 equates to a sample setting of 16 in LightWave 11. This is worth
bearing in mind if you are loading older scenes.
Antialiasing
On the Camera Properties panel, two controls, Minimum Samples and Maximum Samples, now
replace the previous Antialiasing controls found in LightWave, except when using the Classic
Camera. If Adaptive Sampling is unchecked, Maximum Samples is ghosted and you only
need to concentrate on the Minimum Samples value. Using Minimum Samples alone is
similar to the standard Antialiasing levels in previous versions of LightWave.
Tip
Classic Camera antialiasing remains unchanged. The new settings do not apply.
Minimum Samples
This value determines how many camera rays are generated per pixel. As the name suggests, this
allows you to set how many samples are ALWAYS taken per pixel on the first pass of the image.
When Adaptive Sampling is active, setting a higher minimum value will result in fewer Adaptive
Sampling passes, because the image has already been through initial antialiasing.
Maximum Samples
Available when Adaptive Sampling is on, this allows you to limit the number of samples LightWave
will ever take during the Adaptive Sampling process. Previously it was difficult
to set a limit on this stage of the process, meaning that LightWave could keep on refining in the
Adaptive Sampling stage even when no more antialiasing was really needed or perceptibly
visible. This can result in equivalent or improved quality at equal or faster render times than
previous versions.
While the threshold detection has not changed, the way in which additional samples are taken
has. Previously, when the Adaptive Sampling Threshold was breached, the Adaptive Sampling
routine decided how many more samples to take; you had no control over
this. In LightWave 11, every time the Adaptive Sampling Threshold is passed, one
more sample is taken. It keeps taking one more sample each time the threshold is hit until it reaches
the Maximum Samples setting, or until no more pixels are selected for refinement by the
Adaptive Sampling detection.
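A minimal sketch of that refinement loop, as described above (assumed logic and hypothetical helper functions, not LightWave source code):

    # pixels: iterable of pixel identifiers. contrast(p) measures how far a
    # pixel still breaches the Adaptive Sampling Threshold; sample(p) takes
    # exactly one additional sample for that pixel.
    def adaptive_refine(pixels, threshold, min_samples, max_samples,
                        contrast, sample):
        counts = {p: min_samples for p in pixels}
        active = [p for p in pixels if contrast(p) > threshold]
        while active:
            for p in active:
                sample(p)          # one more sample per breach, never a burst
                counts[p] += 1
            active = [p for p in active
                      if counts[p] < max_samples and contrast(p) > threshold]
        return counts

Maximum Samples acts as the hard ceiling on this loop, which is why a very low threshold no longer means runaway render times.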
As a result, lowering the Adaptive Sampling Threshold to resolve finer noise in an
image no longer incurs the expensive render times of previous versions, because the
render engine does not take the same number of samples at that stage. Render times will still be
increased, but what's important to understand is that whereas a setting of 0.01 used to
be as extreme as most users were willing to go, you can now set it to 0.005 or 0.001 along with
an appropriate Maximum Samples setting to achieve equivalent, or in many cases higher, quality
than previous versions of LightWave, without the same time penalty this once meant.
As with previous versions of LightWave, finding a good balance between render quality and speed
is very scene dependent. You may need to experiment with the Maximum Samples and Adaptive
Sampling Threshold on a smaller area of your scene using the Limited Region render.
Additional Notes
Tip
Old scenes loaded into LightWave 11 are not aware of the new Min Samples and Max
Samples settings, and will therefore be set to their defaults. This could mean that your old scene
appears to render either slower, or faster but noisier. You will need to adjust the values to suit your scene.
Due to changes in the way samples are generated, the 'Rays Per Evaluation' and 'Secondary Bounce
Rays' settings in the Global Illumination panel are affected. To get results similar to previous
versions of LightWave, you will need to increase these values, often doubling them. This won't add
any extra render time; it simply accounts for the difference in the number of GI samples generated with
the new system.
When the ‘Use Gradient’ option is checked in the Global Illumination panel LightWave uses the old
sampling method for Global Illumination only. When off then it uses the new sampling.
Instancing
About Instancing
Instancing allows vast duplication of objects in a scene with very little overhead. With instancing,
huge 'virtual' polygon counts can be achieved, allowing the artist to populate their scenes with
incredible detail yet retain reasonable render times and memory usage.
While instances can be thought of as clones of the original source objects, they do not need to look
identical. They can be randomly scaled, positioned, rotated and even surfaced entirely differently
from the source, allowing for a huge variety of uses.
Instancing capabilities have now been added to Layout. The Instancer tab on Layout's Object
Properties panel is where the action happens. Once an Instance Generator has been added to an
object, that object acts as the point source for its instances.
Several instancing distribution methods are provided.
Instances can be displayed in OpenGL as normal objects, as bounding boxes, as points, or not at all,
as the user chooses, and are fully supported in VPR.
Instances work with Classic, Dithered and Photoreal Motion Blur, and surfaces can be varied among
the instances.
Tip
Instances work with all LightWave’s built-in cameras apart from the Classic camera.
There are several types of instancing for the user to choose from. Most of these provide additional
dedicated controls when activated. In addition, each instance type can use the options found
under the tabs at the bottom right of the panel to control the distribution and range limits of
various attributes for instances. Nodes are available to allow for varying the surfaces among
instances, and controlling other attributes as well. First we will list the types and their dedicated
controls, then we will cover the controls in the tabs below.
Overview of Controls
Edit Popup Menu
• Remove Selected - Removes the selected object from the Instanced Objects Window.
• Remove All - Removes all objects from the Instanced Objects Window.
• Replace Object - Pops up a list of the objects in the scene, and the user can pick one to
replace the object currently selected in the Instanced Objects Window.
• Add from Layout - Adds selected object or objects to the Instanced Objects Window.
• Remove from Layout - Deletes selected object or objects from the Instanced Objects Window.
• Replace From Layout - Swaps one item selected in Layout for one item selected in the
Instanced Objects Window.
Instanced Objects Window
Lists the instanced objects and provides controls for several basic attributes.
• Hierarchy (Checkmark) - Selects whether a hierarchy attached to the source item should also
be instanced.
• Off - Attached hierarchy is not instanced (default option).
• On - Attached hierarchy is instanced.
• Item - Name of the object(s) used as an instance source.
• Weight - The weight property can be used to limit (turn off) a proportion of the instances
being shown, regardless of how many instances you have set. It does this by treating the
value you set as the approximate probability of an item being shown. You
can use a simple value, or the Texture Editor for more advanced weighting. This property can
also be animated, so you can have instances appear or disappear over time.
Tip
The weight is not the percentage of the maximum number of instances for each item; it is
the approximate probability, as a percentage, of that item being shown.
Also note that the weight setting is per item, whereas the other settings apply to all items in the instance list.
Weight Examples
Imagine we have a scene with only one item being instanced across a plane, with the maximum number
of instances set to 100. At a 100% weight (or probability) it would clearly show all 100 instances.
However, this setting can be used to limit the number of instances: if we set the weight to 50%, you
would only see approximately 50 instances, because each has a 50% chance of being shown.
1 Item, Weight: 100% (100 Instances out of 100) 1 Item, Weight: 50% (53 Instances out of 100)
If you count the number of instances in the 50% weight image above, you’ll notice there are 53
instances and not 50 as you might expect. To explain why this is the case, we need to add another
item into the mix.
2 Items, Weights: 100% (100 Instances out of 100) 2 Items, Weights: 50% (53 Instances out of 100)
Two items being instanced at 100% weight will result in 100 total instances, with each
item approximately equal in number (both have a 100% probability of appearing, so they are
approximately equal in number). If we set both items to 50% weight, they now each have a 50%
probability of being shown; however, we won't see 100 instances as you might think, but 53, as
before. This might not make sense at first, but the key word here is probability, so let's
explain.
The instancer has been set to have a maximum number of 100 instances. Let’s imagine that
number as a list of 100 ‘instance slots’, that are ‘drawn’ in order from 1 to 100 (this is not how it
works internally, but will help explain the concept).
To achieve a random distribution when we have more than one item to instance, that ‘list of
instances’ also needs to be filled randomly, so as the instancer draws each ‘slot’ in order, we see the
instances scattered in a random fashion rather than ordered in groups of each instance item.
However, creating a randomly ordered list of items to be instanced is not very efficient, especially
when you have the potential to have thousands or even millions of instances. This is why the
instancer does not work this way. Instead, it uses the probability of an instance being shown, or
the weight value.
In reality, the instancer steps through the number of instances set, looking at the items in the
instancer list and working out, for each item, the probability of being shown at that instance number.
It determines this with a random percentage value; if that value is less than the item's weight,
the instance at that slot is shown.
Due to the random number generation, the number of instances that end up visible can vary a
little, which is why there are 53 instances in this case and not exactly 50, but doing it this way is
much faster than building and storing a pre-determined list of random numbers.
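A minimal Python sketch of that per-slot probability test (for intuition only; this is the concept described above, not LightWave's internal code):

    import random

    def visible_instances(max_instances, weight_percent, seed=1):
        """Show each instance slot whose random draw falls below the weight."""
        rng = random.Random(seed)   # fixed seed, like the Random Seed setting
        return [slot for slot in range(max_instances)
                if rng.random() * 100.0 < weight_percent]

    # With 100 slots at 50% weight we get roughly, but rarely exactly, 50:
    print(len(visible_instances(100, 50.0)))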
Tip
To keep things easier to understand, if you have multiple items at different weights, try to
ensure the weight amounts add up to 100%, or leave them all at 100% if you want the instancer to
determine the mix.
While you can enter values above 100% for any item’s weight value, internally all items’ weight values
will be ‘scaled’ by whatever factor was needed to bring the largest value down to 100% (normalized).
• Time Scale - Allows any instanced hierarchies to be animated slower or faster than the
original object hierarchy that was instanced.
• Time Offset - Sets the animation offset (in seconds) of any instanced hierarchies from the
original object hierarchy that was instanced.
Generation Tab
The controls in this tab are where you set the type of generation for instances. The types are:
• Item (Pivot) - Creates an instance at the pivot point of the host object. A typical use case for
this would be using a null to create a clone of a scene item.
• Particles - Creates instances at the locations of particles (for use with particle emitters only).
• Surface - Creates instances on all surfaces or a designated surface of the host object.
• Instances - Set the number of instances to create.
• Surface Name - Select the surface(s) on which the instances will be placed.
• Distribution:
• Random - Places the instances randomly on the selected surface(s).
• Uniform - Places instances at uniform distances on the selected surface(s).
• Relax - (Active when Uniform is selected) relaxes the distribution.
• Radial Array - Creates instances in a 2D circular array on the selected axis around the pivot
point of the host object.
• Instances - Set the number of instances to create.
• Axis - Set the axis for the array.
• Radius - Set the radius of the array.
• Start Angle - Set the starting angle of the array on the circle.
• End Angle - Set the ending angle of the array on the circle.
• Offset - Sets an offset distance perpendicular to the axis chosen between the instance
at the start angle and the instance at the end angle. The instances between will be
proportionally offset.
• Motion Path - Creates instances along the motion path of the host object.
Giving a Null a motion path and then assigning it an Instance Generator allows this chain of Layout-created cubes to follow the Null's
motion path, with a great deal of control over exactly how they do so.
• Instances - Set the number of instances to create.
• Separation - Time in seconds between each instance (see the sketch after this list).
• Time Offset - This slips the instances forward and backward along the motion path,
again in seconds. If the value is 1, the instances appear 1 second
further along the motion path. Negative values slip the instances further back.
• Lock Motion to Time - This makes the instances move along with the motion object. You
can use this in conjunction with Time Offset to make the instances move along with the
object, but further ahead or behind in time.
• Scale Motion Time - Scales the speed that time passes when Lock Motion to Time is
on. For example, 0.5 would move the instances at half the speed of the motion object,
while 2.0 would move them twice as fast.
• Follow Motion Object - Instances are generated from the motion object forward in time,
or backward; this control determines the behavior. If Follow Motion Object is on, the first
instance is generated at the motion object’s position and the rest trail backward down
the path. Otherwise the instances are generated ahead of the motion object with the
final instance generated at the motion object’s position.
• Separation Jitter - Jitters the Separation randomly so the instances are not uniformly
distributed along the path. A value of 1 would jitter randomly from 0 to the separation
value in either direction.
• Offset Frequency - Set in seconds, this randomly jitters the position of the instances as
they move down the path, like a simple flocking behavior. The positions change at
intervals of the Offset Frequency value, so for a value of 2 the instances change
position every two seconds.
• Use Local Offset - The offset values in the Offset tab can be either applied in world or
local coordinates relative to the alignment of the instance on the path.
• Scale Positions Only - If the motion object is scaled as it moves, the instances can be
scaled along with it, or just their positions can be scaled, to make them spread out from
the path or move towards it.
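As a hypothetical illustration of how Separation, Time Offset and Follow Motion Object interact (names and formula are assumptions inferred from the descriptions above, not taken from LightWave's code):

    def instance_path_times(current_time, instances, separation,
                            time_offset=0.0, follow_motion_object=True):
        """Return the path time (in seconds) that each instance samples."""
        times = []
        for i in range(instances):
            if follow_motion_object:
                t = current_time - i * separation   # trail behind the object
            else:
                # lead ahead, with the final instance at the object's position
                t = current_time + (instances - 1 - i) * separation
            times.append(t + time_offset)           # slip along the path
        return times

    print(instance_path_times(current_time=5.0, instances=4, separation=0.5))
    # [5.0, 4.5, 4.0, 3.5]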
Options Tab
OpenGL Display - To allow for faster interaction when working with a large number of instances,
you can set the percentage of instances visible in the viewport. This does not affect final
rendering, but will make VPR more responsive by rendering with a reduced number of instances.
Random Seed - This value is what instancing uses as the base for generating random numbers.
Changing it will vary any settings using 'random' values, e.g. random scale. Every Instance
Generator has its own unique number, but if you wish different Instance Generators to use the
same random state, you can set the same number here. This is useful if you want items in more than one
Instance Generator to be aligned.
Scale, Stretch, Offset and Nodes are identical for each type of instancing. Scale, Stretch, and Offset
each provide two modes, Uniform and Random, for handling the distribution of the attribute.
Scale Tab
Allows for varying the size of the instances. At the top of this tab are two functions that need
additional explanation.
• Scale by Polygon Area - This compares the size of the polygons that will be used to position
instances with the size of the original object being instanced. It also controls the amount of
influence the original object’s scaling has on the Neutral Scale Factor.
• Neutral Scale Factor - This determines what size your instances will be for the polygon area they
are being instanced on. A Neutral Scale Factor of 2 means that two instances will fit
inside the area of the polygon.
Both of these settings function in conjunction with the other scaling and stretching options
available to you, increasing the flexibility available.
Mode:
• Uniform - All instances will be scaled at the same size.
• Scale - Sets the size by percentage.
• Random - Creates instances at varying scales based on the min/max settings.
• Min - Smallest size for an instance, as a percentage.
• Max - Largest size for an instance, as a percentage.
Stretch Tab
Allows for varying the proportions of the instances along each axis (X/Y/Z).
Mode:
• Uniform - All instances will have identical proportions.
• Stretch X/Y/Z - Settings for the percentage of change to make in each axis.
• Random - Creates instances at varying proportions based on the min/max settings.
• Min X/Y/Z - Sets a minimum size in each axis.
• Max X/Y/Z - Sets a maximum size in each axis.
Offset Tab
Allows for an offset from the initial position at which the instance was created.
Mode:
• Uniform - All instances are offset the same distance on each axis.
• Offset X/Y/Z - Settings for the offset distance for each axis.
• Random - Instances are offset at varying distances based on the min/max settings.
• Min X/Y/Z - Sets the minimum value for the distance of the offset.
• Max X/Y/Z - Sets the maximum value for the distance of the offset.
Rotation Tab
Allows for controlling the rotations of the instances in a variety of ways. Some of these options will
be determined by the type of generation used.
• Rectangular Array:
• Item - Instances align with the host item.
• Radial Array:
• Item - Instances align with the host item.
• Rim - All instances are oriented to the pivot point of the host item.
• Motion Path:
• Motion Item - Instances align with the motion item.
• Path - Instances align heading to the direction of travel along the path.
Mode:
• Uniform - All instances have the same orientation.
• Rotation H/P/B - Entry fields to set Heading, Pitch and Bank.
• Random:
• Min H/P/B - Set minimum rotations for Heading, Pitch and Bank.
• Max H/P/B - Set maximum rotations for Heading, Pitch and Bank.
• Target Item - Instances orient to the pivot point of a target item.
• Target Item - Pick the target from the list of scene items, including lights and cameras.
• Target Mode - Changes how targeting is calculated.
• Look At - Instances all look at the pivot point of the target item.
• Axis Aligned - Instances all align with an axis of the target item.
• Axis X/Y/Z - Active when Axis Aligned is the target mode; selects the axis of the
target object to align the instances with.
Nodes Tab
Access to the Node Editor for controlling the transformation and weighting of instances using
nodes. Node flows can also be turned on and off without needing to delete or unhook them.
Instance Generator Node
This works in conjunction with the settings in the transformation tabs on the panel, and allows
node-based control of these items, as well as of the weighting of the instanced items. Any settings
not controlled by nodes will use the panel settings; this allows you to mix node-based control with
the settings on the main Instance Generator panel.
• Scale
• Scale Min/Max
• Stretch
• Stretch Min/Max
• Rotation
• Rotation Min/Max
• Offset
• Offset Min/Max
• Weighting (a discrete input for each weight map)
Tip
The node transformation inputs are affected by whatever mode you have chosen in the main
panel for the corresponding transformation. For example, if you had a node flow controlling Scale Min/
Scale Max, but the Scale Mode on the Instance Generator panel is set to Uniform, then the node
flow will not influence anything, because you have chosen a different Scale Mode from the one the
node flow is controlling.
Example
For the animated abstract sphere above, we used a Turbulence procedural texture to drive the
uniform Scale of our instanced cubes. We also set the Stretching to be Random and converted the
same Alpha output from Turbulence into a Vector to drive the Stretch Max value to exaggerate the
effect.
Surfacing Instances
Instances can be uniquely shaded using Nodes in the Surface Editor, whether just a simple color
change, or completely different materials. This section deals with the options you have at your
disposal.
Random Colors (‘Fixed Random’ Output)
Creating random coloring of instances is done using the Instance Info node, connecting the Fixed
Random output into a Gradient node, and then into any color channel. Setting keys at 0 and 1 on
the gradient represents the total number of instances. An easy way to understand this is to think of
key 0 as being the first instance, and key 1 being the last. Every instance in between is then given
a randomly picked color from the 0 - 1 gradient range. This is useful so that even if you change the
number of instances in the Instance Generator, all instances will be included in the 0 - 1 range.
The colors are picked at render time and remain associated with the instance they were assigned
to for the life of the instance. However, if you change the number of instances, the coloring
may change, as there are now more instances to consider.
[Diagram: instance IDs 0-9 colored from a gradient with keys at positions 0 and 1]
You can of course add as many gradient keys as you like to introduce more random colors.
[Diagram: instance IDs 0-9 colored from a gradient with additional keys, using Hermite or Bezier interpolation]
1) Once you have instances set up, open the Surface Editor and click on the Node Editor.
2) Add the following nodes to your flow:
• Spot > Instance Info
• Gradient > Gradient
3) Connect Fixed Random on the Instance Info node to Input on the Gradient node,
and then Color on the Gradient node to the Color input on the Surface node.
4) Open the options for the Gradient node and set gradient keys at positions
0 (top) and 1 (bottom). Choose a different color for each key.
When rendered, instances will now have random colors between the two colors you set in the
Gradient node.
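Conceptually (a hypothetical sketch for intuition, not the Gradient node's actual code), the Fixed Random value simply looks up a color in the 0 - 1 gradient range:

    def lerp(a, b, t):
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    key0 = (1.0, 0.0, 0.0)   # key at position 0: red
    key1 = (0.0, 0.0, 1.0)   # key at position 1: blue

    def instance_color(fixed_random):
        # fixed_random is the per-instance value in the 0-1 range
        return lerp(key0, key1, fixed_random)

    print(instance_color(0.25))  # (0.75, 0.0, 0.25), a reddish purple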
[Diagram: instance IDs 0-9 colored from a gradient with keys set at positions 0, 3, 5 and 9]
Setting a gradient key to Stepped mode will continue the previous key's color until you specify a new
one. You can also blend between colors by setting the gradient key to Linear, Hermite or
Bezier modes.
1) Once you have instances set up, open the Surface Editor and click on the Node Editor.
2) Add the following nodes to your flow:
• Spot > Instance Info
• Gradient > Gradient
3) Connect ID Index on the Instance Info node to Input on the Gradient node, and
then Color on the Gradient node to the Color input on the Surface node.
4) Open the options for the Gradient node and set as many gradient keys as
you desire for different surfaces. Choose a different color for each key.
The gradient keys now behave differently: each gradient key position corresponds to the
actual instance ID number of the same value. Any instances that don't have specific gradient keys
set will take an interpolated color between the surrounding keys.
Tip
Using an original image sequence works with instances, but using a copy of an image
sequence made in the Image Editor fails upon scene reload. If you need to modify a sequence, just
use a new original sequence to prevent the failure.
Material nodes are a special case when used with instancing. You can certainly use one Material
node with random coloring on its inputs, but applying completely different materials to
individual instances is a different matter. There is a special node designed for this purpose.
The Multi Switch node allows you to switch between different Material nodes based on the
instance ID. Here's how to set it up:
1) Once you have instances set up, open the Surface Editor and click on the Node Editor.
2) Add the following nodes to your flow:
• Spot > Instance Info
• Materials > Multi Switch
• Materials > Car Paint
• Materials > Dielectric
• Materials > Conductor
3) Connect ID Index on the Instance Info node to Switch on the Multi Switch node, and
then Material on the Multi Switch node to the Material input on the Surface node.
4) Open the options for the Multi Switch node and click Add Input until
you have three inputs in total on the Multi Switch node.
5) Connect each Material output on the Material nodes to each
successive Material input on the Multi Switch node.
When ID Index is used as the Instance Info input, every instance cycles through the
Material nodes connected to the Multi Switch node in order. If you want to randomize the chosen
materials, add a Random Integer node (Tools > Random Integer). Connect the ID Index to the Seed
input on the Random Integer node, then the Out output to the Switch input on the Multi Switch
node, as shown in the diagram above.
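In pseudo-Python (hypothetical names, for intuition only), the difference between the two wirings is:

    import random

    materials = ["Car Paint", "Dielectric", "Conductor"]  # Multi Switch inputs

    def cycled_material(id_index):
        # ID Index wired straight to Switch: instances cycle 0, 1, 2, 0, ...
        return materials[id_index % len(materials)]

    def randomized_material(id_index):
        # ID Index wired to Random Integer's Seed: a repeatable random pick
        rng = random.Random(id_index)
        return materials[rng.randint(0, len(materials) - 1)]

    for i in range(6):
        print(i, cycled_material(i), randomized_material(i))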
Flocking
About Flocking
LightWave’s flocking system is based on 3D computer models of coordinated animal motion,
things like flocks of birds, herds of animals or schools of fish. It can be used with LightWave’s
instancing system or HyperVoxels.
The flocking algorithm itself is based on just three principles in order to create this seemingly
complex motion: Separation, which keeps your flock from getting overcrowded;
Alignment, which keeps the flock all heading in roughly the same direction; and Cohesion, which is
effectively the opposite of Separation and keeps your flock bunched together.
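These are the classic 'boids' steering rules; a minimal, self-contained Python sketch of one update step (an illustration of the three principles, not LightWave's implementation) looks like this:

    def flock_step(boids, dt=0.04, sep_w=1.5, align_w=1.0, coh_w=1.0,
                   sep_dist=1.0):
        """boids: list of [position, velocity] pairs, each a 3-element list."""
        n = len(boids)
        center = [sum(b[0][k] for b in boids) / n for k in range(3)]
        avg_vel = [sum(b[1][k] for b in boids) / n for k in range(3)]
        for pos, vel in boids:
            for k in range(3):
                vel[k] += coh_w * (center[k] - pos[k]) * dt     # cohesion
                vel[k] += align_w * (avg_vel[k] - vel[k]) * dt  # alignment
            for other, _ in boids:                              # separation
                d = [pos[k] - other[k] for k in range(3)]
                dist2 = sum(x * x for x in d)
                if 0.0 < dist2 < sep_dist ** 2:
                    for k in range(3):
                        vel[k] += sep_w * d[k] / dist2 * dt
        for pos, vel in boids:
            for k in range(3):
                pos[k] += vel[k] * dt

Note that there is no random number anywhere in the update, which is why the same flock calculates identically on every machine.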
Flocking can be seen in OpenGL and VPR. Instances and HyperVoxels controlled by Flocking can be
motion blurred and use Depth of Field like other instances.
Using Flocking
In order to set up Flocking in LightWave you will need a flock and a director. This director can
be a goal or something to avoid, meaning you can either lead or chase your flock. A director can
be as simple as a null object, though you can also attach a director to geometry - for instance,
the leader of the pack or the nasty predator chasing the herd.
The director needs to move to excite the flock into movement of their own. Although an
unmoving goal will draw the flock to it, it's not very interesting behaviorally. It should also be
stated that the behavior of the flock is completely deterministic; there is no randomness
here. A flock animated on one machine will work in exactly the same way on another and,
more importantly, will render the same too.
Setting up a Simple Flock
1) In a new, empty scene, open the Flock Master window.
2) In the Flock Master window add a Generator. The defaults won’t give us a very
interesting result, so change the following values:
Count X: 10.0
Count Y: 10.0
Count Z: 10.0
Size X: 4.0
Size Y: 4.0
Size Z: 4.0
Flocking
3) Add a Director - Goal to your scene and move it off into the distance. For this
example, ours is moved about 40 m in Z. If you hit Calculate All Motions now
and then Play on your timeline, your agents will all head off to the goal, avoiding
bumping into each other, but otherwise uninteresting until they reach the
goal, at which point they overshoot and try to reach it again, and again.
4) So we’ll give them something to avoid. Add a Director - Avoid and change its Range and
Weight to 5.0 and 5.0. Position it about halfway between your flock’s starting point and
the goal. Hit Calculate All Motions again and you should get something like the image
below:
To get the pretty colored trails for your flock, increase the Show Trails number. It represents the number of frames before and after the
current one that will be shown. We also turned off Show Ranges to show the flock better.
Tip
Make sure you hit the Calculate All Motions button each time you make changes to your
flock or directors before you press Play on the timeline again.
Overview of Controls
Flocking is composed of several custom objects all controlled through a central Master Window
as we saw in the first simple example. You can always control your flock and directors through
the Object Properties > Geometry > Custom Object window and your settings will be duplicated in
the Master Window. For simplicity, this documentation will only describe the presentation of the
settings through the Master Window interface.
This window has three main sections. Across the top are two dropdown menus and the
Calculate All Motions button. Down the left is a familiar listview of all the flocking elements
in the scene. On the right are the settings for each individual element, broken down into tabs.
Menus
• Add New...
• Generator - will create a Null-based generator.
• Director - Goal - will create a Null-based director that your flock will gravitate towards.
• Director - Avoid - will create a Null-based director that your flock will flee.
• Director - Deflect - will create a Null-based director that is less scary than an Avoid.
• From selected Scene item
• Generator - This allows you to add a Flocking generator custom object to an
existing scene item.
• Generator from Mesh Points - This option turns the points of the object
into placements for your initial flock, giving the opportunity to create initial flock
shapes other than the default box or sphere.
• Generator from Mesh Polys - As above, but uses the mesh's polygons to place
the flock. In both cases, flocking rules apply, so if the points or polygons are too
closely spaced the flock will initially explode outwards.
• Director - Goal - Adds a Director custom object to an existing scene item.
• Director - Avoid, Deflect - As above.
• Edit
• Remove Selected, Remove All - Removes either the selected flocking generator or
director, or removes them all at once.
• Duplicate - An exclusive operation (copy operations outside the Flocking window
do not affect what you have copied inside it) that allows you to duplicate Flocking
elements.
• Calculate All Motions - This is the most important button in the Flock Master window.
Make sure you hit it every time before you hit Play in the scene. Calculations should not
take very long, even with complex flocking motions.
Items List
The listview on the left shows all the flocking elements in the scene.
Generator/Director Cache Settings
Starting with the Cache tab, since that is common to all flock elements: this tab controls caching for
the scene to make network rendering consistent and less processor-intensive.
• Use Cache toggle - Ticking this locks the Properties tab completely; you will not be able to
make new changes to the behavior of your flock or directors.
• Cache filename - When Use Cache is on, this field is ghosted and serves only to display
the cache filename and location.
• Load Cache - This button is available whether you are using a cache or not, and allows you to
load a different saved solution for your flock elements.
• Save Cache - Available when Use Cache is not enabled; presents a file requester where you
can choose where your cache will be saved.
Director Properties
Directors are the Flocking elements that direct flocks. There are several choices to make when
picking the type you want:
• Goal - The flock will be drawn to this director in as direct a route as possible.
• Avoid - The flock will flee this director in as direct a route as possible.
• Deflect - The flock will flee this director in a more tangential fashion.
• Path - The animation path for this Director will be used as a path for the flock to follow.
• Converge - The flock will be drawn to this goal, but in a tangential manner.
• Range - Leaving this at 0 will affect the whole scene. If you have multiple directors in a scene,
you can give them fixed ranges, and when the flock enters the range of a second goal, some
agents will peel off to go to the new destination. The Range circle shown in the viewport
indicates the linear falloff of the goal's attractive power, with the maximum at the central
point and 0 at the edges (see the sketch after this list).
• Weight - For a Goal or Converge director, this dictates how much overshoot an agent will
make when reaching its goal. A low weight means the overshoot will be significant;
a higher weight will tighten the agent's orbit around the goal. For an Avoid or Deflect
director, Weight determines how far out from the director's central point the flock is
permitted to come. Set the weight high enough and the flock won't enter the sphere of
influence at all.
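As a hypothetical illustration of the linear falloff described for Range (the formula is inferred from the description above, not taken from LightWave's code):

    def goal_attraction(distance, director_range, weight):
        """Full strength at the director's center, zero at the Range edge.
        A Range of 0 is treated as affecting the whole scene."""
        if director_range <= 0.0:
            return weight
        return weight * max(0.0, 1.0 - distance / director_range)

    print(goal_attraction(distance=2.5, director_range=5.0, weight=5.0))  # 2.5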
Path Example
The Path option for a Director gives you a lot of direct control over your flock, with a little
bit of chaos as some agents fall out of the path you give them. It creates a very natural-looking
path-following animation. Here's a quick guide to how it works:
1) Create a Null and give it a motion path over 600 frames. To simplify our starting
example, make sure the path doesn't get too close to itself over its course.
2) In the Object Properties for the Null, assign a Custom Object > Flock Director. Double click
on the entry and set the Director Type to Path. Give it a big range to start with, say 8.0.
3) Open the Flocking Master window and create a Generator. Leave it on defaults.
4) Move the Generator away from the start of your Null's path, otherwise the flock won't enter it.
5) Now hit Calculate All Motions and your flock should find the path and follow
it. You can scale the nulls you made your path with in order to create wide
and narrow passages in your path. It might be good to start your motion
path with a scaled-up Null to offer a larger target for the flock to find.
Tip
For now you can only have a single Path Director in a scene, with no other types of director.
Generator Properties
There are three tabs for Generator Properties as follows:
• Generator
• Shape - This dropdown offers two choices for the starting shape of your flock: Box and
Sphere. If you have used Object Properties to attach a Flocking Generator to an object,
you are also given two further choices:
• Mesh Points - There are no numeric values here, a flocking agent will be placed at
every vertex of your object.
• Mesh Polygons - As above. The agents will be placed at the center of each polygon.
• Count X/Y/Z - If you are using the Box or Sphere options, you can decide how many
flocking agents you would like in your flock. For ground-based flocks use 1 in Y.
• Size X/Y/Z - If you are using a Box generator this is the length of each side; if a Sphere,
the radius in each axis.
• Agents
• Presets - A choice of behavior types that will populate the settings below.
• Range - This is the maximum range between flocking agents. If a member of a flock
ends up outside this range, it will wander off.
• View Angle - The cone of vision that the agents have once moving.
• Avoid Collisions m/s² - By default this is set at 0.1. It can be set higher, but with swarms of
flocking agents the result can be very staccato.
• Cohesion m/s² - This is how "sticky" a flock is. For most uses it can be left at 0; if you want
to change it, do so incrementally.
• Match Velocity - This is how much an agent will try to match the average velocity of all agents
in its local neighborhood (defined by the Range value). At 100%, all agents within
range of each other match velocity completely, so they can never turn; only lone agents will
be free to turn around.
• Acceleration m/s² - How fast the flock can achieve its maximum speed.
• Min Speed m/s - If the flock's speed is not 0, this is the minimum speed at which it will move.
• Max Speed m/s - How fast the flock can travel.
• Lock to Ground toggle - This will ensure that your agents stay locked to a ground plane
level you set.
• Ground Level - This setting is only available if you are using the Lock to Ground toggle
and allows you to set the ground level desired. You can envelope or texture this level.
• Takeoff
• Initial Velocity m/s X/Y/Z - Assigns an initial direction for your flock to move in.
• Use Normals - If you have assigned your flock to be generated by the vertices or
polygons of an object, you can use their normals as a vector for initial velocity.
Complex Example
Now we’ve seen all the possibilities offered by the controls of Flocking, let’s have a more involved
example that uses other LightWave 11 features. The example is a loose guideline to steps to follow
since your own will doubtless follow a slightly different path, just like Flocking.
1) We’ll start with loading Lino Grandi’s menacing mecha fish. Use Load from Scene to bring in
the fish from \Flocking\Mecha_Fish\Scenes\Fish_Animation_Baked.lws.
2) In Layout, we’ll set our scene length to 600 frames and make a new Generator
for a flock. Use a Box with a count of 4 x 4 x 4 and a size of 5 x 5 x 5. Next visit the
Agents tab and change the preset to School. That will give us a flock of 64 fish.
Once we’re happy with how they move we will vastly increase the numbers.
3) We need to apply our fish model to our flock, so call up Object Properties for the Generator
and go to the Instancing tab. Add an Instance Generator and set it to Particles. Add Object
and choose your fish model.
4) You now have a school of 64 fish with no goal, so we'd better add one. In
the Flock Master window, add a new Director and set it to Goal. Move the
goal away from the fish so they have something to swim towards.
5) We have made an instanced school of fish swim towards our goal, but the animation is
fairly static and the fish stay in their initial cube shape, so we need to add more interest. On
the path between our Generator and Goal, add a new Avoid director and set its range to
something adequate for your scene (try a range and weight of 10 m to start with and adjust
if that seems too large). Calculate again and now your fish will split to go around the obstacle.
6) It's still not amazingly interesting, so add two or three Deflect directors and place them
offset to the side of and above the direct path, making sure that they at least overlap the
path a little and influence the flock as it passes them on the way to the goal. Deflect directors
are more subtle than Avoids and will shape your flock's movement more gently. They will
also need ranges and weights.
7) Now let's move the Goal. Scrub through the timeline and watch for when your
flock reaches the goal. At this frame, move the goal to a different location
and keyframe it in place (if Auto Key is active, this will happen automatically).
Now calculate and watch the fish change direction. If the new position for
the Goal takes the fish past the Avoid or Deflect directors, so much the better.
8) To get rid of our original model and leave only the instances in our scene, we need to visit
the Scene Editor and make sure that the original model is marked as inactive by unchecking
the leftmost column entry for it.
9) You have probably noticed that we have four entries in our Scene Editor for MechaFish. Our
fish is animated through an MDD file, and to prevent all our flock from making the same
movement at the same time, we cloned our initial fish several times. For each clone we
visited the Object Properties window's Deform tab and edited the MD Reader so that there
was an offset in the animation and also a difference in playback rate.
10) Now that you have an animation you are satisfied with, all that remains is for our 64
fish to become a real school. Try upping the quantity to 10 on each axis for 1,000
fish, but you'll also want to increase the size of the box they "hatch" from, because
if it's too small they will explode outwards trying to maintain their distance. Bear
in mind also that you will probably want to increase the size of your Avoid and
Deflect directors so they have more impact on a much larger quantity of fish.
Tip
Don’t forget to keep hitting the Calculate All Motions button in the Flocking window each
time you make a change to your scene.
Tip
In addition to moving your goals, try using envelopes on the size and weight of the Avoid
and Deflect directors to create sudden disturbances that make the fish turn sharply or billow
in their movements.
Bullet Dynamics
About Bullet Dynamics
LightWave 11 adds Erwin Coumans' open source, production-proven Bullet Physics Library to its
dynamics tools. This new feature is extremely simple to add to a scene, but its use is a little different
from the HardFX that LightWave users are familiar with.
In order to ensure dynamics calculations are free of problems, it is worth noting these guidelines:
• Your models should be neither very small nor very large; Bullet gives more predictable results
when working with objects sized roughly between 0.4 and 10 m.
• Any collision objects such as ground planes should not be infinitely thin, but have some thickness.
• Objects made of triangles and quads will generally behave better.
• When using the built-in shapes within Bullet, have some Collision Margin set; 10 mm is a
good default.
• Your objects should be prepared for dynamics, ensuring:
• Models are 'air-tight', meaning all points are merged and there are no un-modeled holes
in your mesh (missing polygons).
• All single points and 2-point poly chains are removed.
• Objects don't have extremely long, thin polygons.
Example
Let’s start with a simple example just to try it out. For this we won’t even need to use Modeler, since
we will use the updated Modeler Tools in Layout to create all the geometry we need for the scene.
1) To get going, we need a floor for our test object to collide with, so go to the Modeler Tools tab
> Create Geometry > Cube and set X: 10 m, Y: 0.25 m, Z: 10 m.
2) We need to add some elements to bounce off our floor, so we shall add a
default Sphere from the same menu. No need to change its settings.
3) Next, we'll make some clones of our sphere; three should do the trick. We'll
need to move the spheres up from Y=0 and separate them to give them
some distance to fall. You should end up with something like this:
4) Now we'll go into the new FX Tools tab. Select each Sphere in turn and
click the Rigid Body button. Next, select the New Ground Plane object and
click the Static Body button from the same group of buttons.
Now hit Play. That's it! For more control, click the Item Properties button in the FX Tools tab in Layout
to give your objects different properties and see how that changes things. The following section
covers the controls you have in Bullet to achieve various dynamic effects.
Overview of Controls
In Layout there is now an FX Tools tab which hosts Bullet as well as other items. Bullet’s properties
are split into the following groups:
Bullet Group:
• Enable Dynamics - This turns Bullet on and off globally for the scene.
• Item Properties - This opens the Bullet Properties on the Item tab. If any dynamic objects in
Layout are selected when this is clicked, they will be selected in the Bullet panel.
• World Properties - This opens the Bullet Properties on the World tab.
• Remove Body - Removes selected objects from Bullet, making them no longer part of any
calculations.
Dynamic Body:
• Rigid Body - The object will be subject to all the forces Bullet can offer, such as Gravity or
Density, but will not break apart.
• Parts Body - This is a great accompaniment for the new Modeler Fracture tool. It keeps your
object whole until there is a collision event.
Collision Body:
• Static Body - This is the type to add for objects that will not move or be subject to other
forces, but will react with other bodies. A floor is a good example of a static body.
• Kinematic Body - Gives the selected object characteristics similar to Static, but it can be
moved through keyframed animation, thus it is an object that is under user control. As with
Static, it affects other dynamic items but cannot be affected by them.
A visual indication in OpenGL of which items have Bullet applied is available if you use the
Draw Collision Shapes setting in the Bullet World Properties tab. To apply this, and for further
customization, click the Bullet Item or World Properties buttons in the FX Tools tab.
The Bullet Properties panel gives per-item control over attributes including the following, covered
in the tabs described below:
• Density
• Friction
• Bounciness
• Linear and Angular Damping
• Glue Strength
• Breaking Angle
• Breaking Distance
• Merge Points
Item Tab
• Dynamic - This button replicates the functionality found in the list for enabling or
disabling an individual object. (You can also double-click an item in the list to include
or exclude it from dynamics calculations; remember, when an item is disabled, the
dynamics solution will need recalculating.)
• Type - Presents a dropdown with the same dynamic type choices as the
Layout FX Tools tab, namely Rigid, Parts, Static and Kinematic.
Shape Tab
• Shape
• Shape - The collision shape the object has for Bullet. This is a dropdown with six
options: Box, Sphere, Capsule, Cylinder, Convex Pieces and Mesh. Convex Pieces
approximates the mesh shape as a set of convex pieces. Convex pieces are faster to
simulate than Mesh shapes, but take time to pre-process (the "Decomposing" passes
you see happen). Since Convex Pieces approximates the shape, you can lose some
details that may or may not be important, unlike Mesh, which uses the shape
exactly. The other four are simplified shapes that make calculation swifter.
• Collision Margin - Adds a margin to the collision shape. It makes the shapes a bit thicker
and can prevent fast moving objects from tunneling through other objects.
Tip
When using the built-in shapes, a Collision Margin greater than 0 is recommended (10 mm).
It won't offset your objects, but helps with calculation errors; this is especially true for smaller objects.
When using any of the Mesh modes, the Collision Margin will offset your geometry, but can greatly
reduce calculation errors, so try setting it to a low value, and only set it to 0 mm if you really need to.
Activation Tab
• Initial Activation - Sets whether the object starts off active (affected by forces such as
gravity) or sleeping (not affected by forces until the first time the object is hit). When
an object stops moving (or close enough), it will be put to sleep, unless it is set to the
third option, Always Active. The fourth option, Activate on Last Key, allows you to give an
item some momentum before it is affected by Bullet dynamics; it works based on the
velocity of the motion keys you have assigned the item.
Two keyframes have been set for the ball at 0 and 3. When the last keyframe has passed, Bullet takes over using the velocity
and vector set by the keyframes. The distance between the two keys gives the velocity of the ball, so setting a wider gap
makes for a faster trajectory.
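To make the handoff concrete, here is a minimal Python sketch of the velocity Bullet inherits at
the last key; the frame rate and key positions are illustrative, not taken from the scene above:

    # Velocity at handoff = distance between the two key positions
    # divided by the time between the keyframes.
    fps = 30.0                                # assumed scene frame rate
    key0_frame, key0_pos = 0, (0.0, 1.0, 0.0)
    key1_frame, key1_pos = 3, (0.0, 1.0, 2.4)

    dt = (key1_frame - key0_frame) / fps      # 0.1 s between the keys
    velocity = tuple((b - a) / dt for a, b in zip(key0_pos, key1_pos))
    print(velocity)                           # (0.0, 0.0, 24.0) m/s

Widening the positional gap between the same two frames raises the result, which is why a
wider gap makes for a faster trajectory.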
• Deactivation Time (s) - This allows you to control when dynamic items are set to rest
by Bullet. Measured in seconds, this value sets how long a dynamic item’s linear and
rotational speeds have to stay under the threshold settings before it is put to
sleep by Bullet. This helps control any jittering of items near the end of a simulation.
Sleeping items will naturally be awoken if they are disturbed by any further collisions.
The interaction of the three settings is sketched after this list.
• Linear Speed Threshold - The linear speed (meters per second) below which an item
must remain, for the Deactivation Time, before it is stopped by Bullet.
• Angular Speed Threshold - Same as Linear Speed, but for rotation. The rotational
speed (degrees per second) below which an item must remain, for the Deactivation
Time, before it is stopped by Bullet.
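Taken together, the three settings behave roughly as in the following Python sketch; this is an
illustration of the rule described above, not Bullet's actual implementation:

    def should_sleep(samples, deactivation_time, linear_threshold,
                     angular_threshold, dt):
        """samples: (linear_speed, angular_speed) pairs taken dt seconds apart."""
        quiet = 0.0
        for linear, angular in samples:
            if linear < linear_threshold and angular < angular_threshold:
                quiet += dt
                if quiet >= deactivation_time:
                    return True          # slow for long enough: put to sleep
            else:
                quiet = 0.0              # any burst of motion resets the timer
        return False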
Properties Tab
• Mass Method - Determines how Bullet calculates the ‘heaviness’ of an item. Bullet can
calculate an object’s ‘weight’ based on mass or density. Mass is the overall ‘weight’
regardless of what the object is made of or how big it is. For example, you can set
a 1 m cube to the same mass as a 10 m cube and they will weigh exactly the same;
whereas if you set the density of a 1 m and a 10 m cube to 1 kg/m³, the 1 m cube’s
mass will be 1 kg (1 m x 1 m x 1 m x 1 kg/m³ = 1 kg) and the 10 m cube’s would be
1,000 kg (10 m x 10 m x 10 m x 1 kg/m³ = 1,000 kg). Quite a difference in Bullet’s
world! Bullet offers several calculation methods:
• Given Mass - The total overall weight of the entire object, set by Mass (kg).
• Volume Density - Sets the Density (kg/unit) setting’s ‘unit’ to the volume
of the object, measured in cubic meters.
• Area Density - Sets the Density (kg/unit) setting’s ‘unit’ to the surface area of
the object, measured in square meters.
• Point Density - Sets the Density (kg/unit) setting’s ‘unit’ to the number of
points on the object; each point’s mass is the value set in Density (kg/unit).
• Mass (kg) - Only available when the Mass Method is set to Given Mass. This sets the total
weight of the item. For Parts mode, the amount is divided up accordingly to each part’s
overall volume, so smaller pieces will be lighter than larger pieces, as expected.
• Density (kg/unit) - Only available when Mass Method is set to any of the ‘Density’ options.
The mass of the object is then calculated by multiplying the Density value (kg) by the
measure used by the chosen density mode; therefore: kg x (Volume/Area/Number of
Points). See the sketch at the end of this section for the arithmetic of all four modes.
• Friction - The friction coefficient of the object. Makes objects slow down when in
contact with other items.
Tip
In order for Friction to have an effect, colliding objects must each have a non-zero Friction
value.
• Bounciness - Determines the level of rebound when objects collide with each other.
Tip
In order for Bounciness to have an effect, colliding objects must each have a non-zero
Bounciness value.
• Linear Damping - Dampens the linear (translational) motion of the object to slow down
its movement.
The images compare Linear Damping at 0% (blocks only), 25%, 50% and 100%.
• Angular Damping - Dampens the Angular (rotational) motion of the object to slow
down its spinning.
The images compare Angular Damping at 0% (blocks only), 25%, 50% and 100%.
When Parts is chosen as the body type for an object, additional parameters become available:
• Glue Strength - This determines how tightly bound parts are to each other. A low value
will have them explode apart at first touch, while a high value will keep them together.
The images compare Glue Strength at 0%, 25%, 50% and 100%.
Tip
Glue Strength with no Breaking Angle/Distance set is very dependent on the item’s scale and
Mass Method used. Be aware that if your object’s weight/mass is too low, Glue Strength can appear to
have little effect when set above 0%. Try increasing your object’s weight/mass until you see the effect.
• Breaking Angle - The threshold angle up to which parts can twist/bend before the
‘bond’ (Glue Strength) between them breaks and the parts then separate.
• Breaking Distance - The threshold distance up to which parts are allowed to move apart
before the ‘bond’ (Glue Strength) between them breaks and the parts then separate.
• Merge Points toggle - When active, Merge Points will attempt to ensure that matching
vertices on either side of a fracture have the same position while the parts are not
broken. This does have a cost in processing time and memory.
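To make the Mass Method arithmetic concrete, here is a minimal Python sketch of how the
four modes turn the Mass (kg) or Density (kg/unit) value into the final mass Bullet works with;
the function is illustrative, not LightWave's own code:

    def bullet_mass(method, value, volume=0.0, area=0.0, points=0):
        """value is Mass (kg) for 'given', Density (kg/unit) otherwise."""
        if method == "given":        # Given Mass: value IS the total mass
            return value
        if method == "volume":       # Volume Density: kg per cubic meter
            return value * volume
        if method == "area":         # Area Density: kg per square meter
            return value * area
        if method == "point":        # Point Density: kg per point
            return value * points
        raise ValueError(method)

    # The 1 m vs 10 m cube example, both at a density of 1 kg/m^3:
    print(bullet_mass("volume", 1.0, volume=1.0))     # 1.0 kg
    print(bullet_mass("volume", 1.0, volume=1000.0))  # 1000.0 kg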
World Tab
The settings on this tab affect the whole Bullet simulation.
• Gravity X/Y/Z - Although the default is an Earth-standard gravitational acceleration of
-9.8 m/s² on the Y axis, nothing prevents you from pointing gravity in any direction, or even
varying it using Envelopes and Textures.
• Dynamics Framerate (fps) - Sets the frames per second for dynamics calculations; by default
this is 180 fps. Decreasing the value results in faster calculation times at the expense of
simulation accuracy (objects penetrating each other). Increasing it results in the opposite:
a more accurate simulation, but with slower calculation times.
• Time Scale - By default this is set at 100%, however, using an Envelope you can slow down,
speed up or even reverse dynamics time independent of the scene time to create cool ‘Bullet
Time’ animations like those first popularized in the “Matrix” films.
• Draw Collision Shapes - This shows both mesh cages and the built-in shapes Bullet is using for
the simulations in the viewport.
• Reset - If you ever see results in Bullet that seem strange to you, it could be that the cache
needs recalculating from scratch. This button does just that: it clears and resets the cache Bullet
uses when solving the dynamics, then rebuilds it when you scrub or play the timeline.
Bullet Cache
As Bullet solves dynamic calculations, it stores the results in a cache so that animation playback
can be fluid. The cache is automatically stored in the ‘Dynamics’ folder of the current Content
Directory for the scene (this can be changed in the Options > Paths window, which can be accessed
by pressing the o key). The cache file is named the same as the scene file, but has a .dynacache
extension.
Tip
To generate the full cache for a dynamics simulation, a quick way is to go to the end frame
on the timeline, rather than pressing play. Either click on the last frame in the timeline, or bring up the
Goto Frame window (f key). Doing it this way means that although LightWave has to calculate the
simulation for the whole scene in one go, it doesn’t have to display every frame update in the viewport,
resulting in quicker calculation times.
Fracture
About Fracture
This new Modeler tool (Multiply tab > Destroy > Fracture) breaks apart an object and makes the
resulting segments into parts for use with dynamics. There are three algorithms implemented in
the Fracture tool: Voronoi, an algorithmic way of breaking an object in a natural-looking fashion;
Halves, a recursive split-in-half algorithm; and Matrix, a matrix-cutting algorithm.
Many aspects of 3D work involve modeling properly for the intended use of the model. For
example, many articles and discussions are available on how to model characters in order to be
able to animate them properly. The flow of the polygons in the model is critical to getting the
correct deformations to occur. There are similar discussions of modeling requirements for other
purposes. For the current version of Fracture, you will get better results if you avoid geometry with
long thin polygons, or with a disparity in the size of the polygons in the mesh - large polygons
mixed with tiny ones.
Cutting tools need to be fully closed meshes. For example, a cutting tool for Fracture cannot be a
cone that you have chopped the top and bottom off of and distorted, so that you have an open
mesh with a hole in the middle; you need a closed mesh, with no holes and preferably tripled.
Lastly, any object to be Fractured needs to be frozen, not sub-patched, because Fracturing will
destroy the polygon flow needed by subpatches.
Overview of Controls
Common Settings
• Interior Surface - Entry field to set a name for the interior surfaces created in the fracture
process.
• Random Seed - Entry and minislider to adjust the random seed used for the fracturing
process.
• Preserve Original toggle - On keeps the unfractured original geometry in the layer it was
created in. Off does not preserve the original geometry. The undo system in Modeler will not
always be able to recover the original geometry, so maintaining the original with this option
can be useful if you need to retry the fracturing to get the results you prefer.
• Create in - This offers the choice of either creating a completely new fractured object or
creating a new layer for the existing object.
• Part to Layers - LightWave will happily deal with the fractured object being in a single layer,
but if you need to export to a different program that needs separate layers you can use this.
Explode Controls
• Explode Parts toggle - When checked this makes an exploded Morph map of the parts.
• Explode Distance - Sets the maximum distance for the explosion. This controls how far to
move each connected part away from the center of the object.
• Explode Randomness - Specifies a percentage by which to vary the distance. For instance,
if distance is 1 m and randomness is 50%, then the pieces will move away from the center
anywhere between 0.5 m and 1 m (see the sketch after the tip below).
• Explode Morph - Name for the exploded morph map.
Tip
Morph Maps can be examined using the M icon at the lower right of Modeler’s interface and
the dropdown menu that goes with it. This way you can see the morphed object and the base fractured
version.
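Here is a minimal Python sketch of how Explode Distance and Explode Randomness interact,
as described above; this is illustrative, not the tool's actual code:

    import random

    def explode_offset(distance, randomness):
        """How far a part moves from the object's center. With distance=1.0
        and randomness=0.5, results fall between 0.5 and 1.0, as in the
        example above."""
        return distance * (1.0 - randomness * random.random())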
Fracturing Algorithms
Voronoi
• Use - There are three choices:
• Random Points - This will create randomly positioned Voronoi cells with which to break
up your object based on the value set with the Cell Count field.
• Background Layer Points - A set of points on a separate layer to the geometry you wish
to fracture will be used for the basis of the Voronoi cells. The points need to be shown
as a background layer, similar to Boolean operations. Points outside the geometry’s
bounds will be clamped to the bounds for the cutting operation.
• Background Layer Polys - This is something of a mix between Random and Background
Layer Points. To use it, you need to create a closed 3D shape in a layer that you have as
the BG layer for the object to be fractured, and the Cell Count field is available to fill that
shape with the required number of cells used to fracture the original object.
• Cell Count - This field is ghosted if you are using Background Layer Points, since the number
of cells is determined by the quantity of points you use.
This image shows a Background Layer Points Voronoi fracture. Although the points are not within the bounding box of the mesh to be
fractured they still have a pleasing effect.
Background 3D shapes for cutting Voronoi Fractures in Background Layer Polys mode must be
closed objects. It is best to triple them and make sure all normals are facing outwards.
Matrix
• Detail - The detail factor controls how many pieces are created. The value entered is
exponential (similar to subdivision surfaces): 1 = 4 parts, 2 = 16, 3 = 64, etc. (see the
sketch after the images below).
• Jitter Amount - Controls how much to jitter the source cutters. Higher values lead to more
uneven results.
• Jitter Iterations - Controls the number of times jitter is applied. This is similar to the iterations
value in smoothing tools. Many iterations with a low amount give a different result from few
iterations with a high amount: the former moves every point around a little each time, while
the latter moves certain points much further.
The three images show differing levels of Jitter and Iterations: 1 has no jitter; 2 has a Jitter amount of 50% but a high iteration
count of 50; and 3 has a jitter amount of 90% with only three iterations, giving a far more jagged appearance.
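The Detail and Jitter relationships can be sketched in a few lines of Python; this is illustrative
only, not the actual cutting algorithm:

    import random

    def matrix_part_count(detail):
        """Detail is exponential: 1 -> 4 parts, 2 -> 16, 3 -> 64."""
        return 4 ** detail

    def jitter_1d(value, amount, iterations):
        """Many iterations of a small amount nudge the value repeatedly;
        few iterations of a large amount move it in bigger steps."""
        for _ in range(iterations):
            value += amount * (random.random() - 0.5)
        return value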
Halves
• Cutter - This determines the shape of the cutter used to fracture the objects.
• Fractal Cube - Creates more organic results - like a rock being fractured.
• Cube - Makes straight slices - very angular results.
• Max Angle - This sets the maximum angle of rotation of the cutter as it is making slices. If you
set it to 0, all slices will be axis-aligned and the results are very cube-like. With a high value,
more sharp pieces that aren’t axis-aligned are generated.
• Unevenness - This chooses how uneven the splits are, by percentage. At 0%, everything is
split down the middle. At higher values, the algorithm splits more off center. It’s a guide, so
where it cuts is still random, but higher values give more random results (see the sketch
after the images below).
The three images above show: 1) perfectly even cuts, with no unevenness and no angle variation; 2) added unevenness, giving cuts
that are not exactly halves; 3) added angle variation.
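Here is an illustrative Python sketch of how Unevenness biases each cut (a guide value, as
noted above; this is not the actual cutting code):

    import random

    def split_fraction(unevenness):
        """Where along a piece the cut lands, with unevenness given as
        0.0-1.0: 0.5 is dead center (0% Unevenness); at 100% the cut
        can wander anywhere from 0 to 1."""
        return 0.5 + unevenness * (random.random() - 0.5)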
Fracture in Layout
Fracture is also available in a simplified version directly in Layout. You can find the tool in the FX
Tools tab in the Destroy group.
Like Fracture in Modeler, you can define the interior surface name, the random seed you wish to
use and the number of cells you wish to break your object into. The Voronoi method will be used
for fracturing and your object will be saved with the new filename you choose.
Virtual Studio Tools
With a virtual studio, LightWave can support many of the capabilities of a real studio, such
as recording and adjusting live action, the key being ‘live’ action. Animators have always
had the ability to adjust motion in an animation at a fundamental level (keyframes),
but producing the most realistic-looking motion requires capturing real-world data or
simulating it procedurally. The purpose of LightWave’s virtual studio is to work with real-world
data by bridging the gap between LightWave and a real-world studio.
Before we go into specifics, here is a presentation of the three principal windows Virtual Studio
Tools uses in Layout. These can all be found in the Virtual Studio tool group in the Top Menu
section of Layout’s menus.
Device Manager
This window assigns your connected device to be used for the Virtual Studio. You need to click the
Enable column for the device type; this will be either a HID (Human Interface Device)-compatible
peripheral, the Intersense VCam or the PlayStation 3 Move controller.
Once you have enabled your specific device, you can click on the device’s name to see the readout
of LightWave-supported Tracks.
The generic version of a HID device is purely the Windows HID manager version. The generic device
is limited in functionality, usually just to button presses; elements like accelerometers need to be
specifically programmed. If you need to choose between devices, always choose the non-generic
version that has been specifically programmed for LightWave.
Control Booth
This is where you set up scripts using Layout commands to drive functionality for your device.
To know what commands are available go to the Utilities tab and click Save CMD List. The Space
Explorer in this example has fifteen buttons in addition to the central controller and these can
be assigned actions in the Control Booth. These settings are saved into a config file in your user
directory: %USERPROFILE%\.NewTek\LightWave\11.0\configs\ControlBooth.cfg on Windows systems,
and under OS X: ~/Library/Application Support/NewTek/LightWave/11.0/configs/ControlBooth.cfg
Studio
This is where you will record your performances using the devices you set up in the Device
Manager and the scripts you set up in the Control Booth. You can have an unlimited number of takes.
Listed in this window are the items that will be recorded with the Virtual Studio.
• Name - the name of the item. Double clicking on the name will open the node graph
showing the connections needed for setting it up.
• Value - is the readout of the current output from the device.
• Padlock - Allows the user to lock all the settings for the item (or group) so that no changes
can be accidentally made without first clicking the padlock again.
• A - For active. This determines whether the virtual studio will be looking at output from the
device.
• Clapper board - The take number you are on. You can click on this number to change the
take, or to add a new one.
• Snowflake - Freeze. This icon stops all evaluation of the item.
• Rec - This arms the device for recording. This is an individual record function so that you
control which traits are to be recorded.
• Play - This arms the specific channel for playback. Like record, it’s an individual function.
Note that to get VPR to resolve you need to turn off the LIVE! function so that the Virtual Studio isn’t constantly evaluating your 3D device.
Example
The director wants to shoot a turntable of a new car and wants to direct you in the motions
you will use. We’re using the Studio_Spyder scene from the content for this example. We have
provided a scene in the content called CameraSetup.lws that contains nodes to set up
a 3D Connexion device for use as the camera. Use Load from Scene to extract the camera called
VirtualStudioCam from this scene. If you wish to build your own node network, this is explained
starting on page 72 of the manual.
Tip
Before anything else, make sure your Nav tab in Options is set to Device: (none)! You don’t
want to make Layout calculate the same information twice.
A Space Explorer was used for the purposes of this scene. If you have a different 3D Connexion
device, you will need to double-click on the ItemMotion:VirtualStudioCam trait item in the Studio
window and open the Device node to associate your 3D Connexion device with the node network.
Once you have your new camera in the scene you will see that it is positioned badly, so you will
need to reposition it before recording.
1) You need to make sure you have Active, Live!, Record and 1-shot all checked in the
Studio window. Up on the trait for the Item Motion: Single you want the Rec and
Play buttons checked too. You should be on Take 1. The scene only has 60 frames
so it’s going to be quick, but that’s okay for this test. You need to be comfortable
with using the 3D Connexion device to control the camera smoothly and accurately
and you can always change the playback rate of your scene if necessary.
2) When ready, press play on the scene and be ready to move your
3D mouse to capture the changes in position you want.
3) When the scene is done playing the Record button will switch off. Click the Play button
instead and play your scene back. You will see the motion you recorded. If you’re not happy
with it, you can always hit the Record and 1-shot buttons again and redo the take. If however,
you’d like to keep this take and try another, click in the Studio window in the clapper board
column. You will be presented with the possibility to choose between your takes or create a
new one.
All four takes were rendered and saved into a single scene.
Item Info > Trait - A trait is a motion modifier added to a scene item making it visible to the Virtual
Studio. Once you have added this modifier to your scene item, the scene item will become
available in the Trait node in the node editor.
Item Info > Device - A device is the tool you are using to control the Virtual Studio, whether it be a
3D Connexion peripheral like the Space Navigator or Explorer, a PlayStation 3 Move controller or an
Intersense VCam. The outputs for a specific device will be appropriate to that device.
When you add a device to your nodal network, you need to double click the node to present the
following window where you will choose the device you wish to use.
Tip
If you set up a scene with a 3D Connexion Space Explorer or similar with additional buttons
and configure nodes relating to those buttons, then use a Space Navigator with no additional buttons,
the connections for the missing buttons will be lost if you save the scene. The nodes will still be there, but
the connections will have to be remade when you reconnect a more fully-featured 3D Connexion device.
Math > Vector > HPBMatrix - Converts Euler angles (heading/pitch/bank) to a 3x3 matrix, output as
right/up/forward vectors (each being a nodal vector type).
Math > Vector > MatrixHPB - Converts a 3x3 matrix (specified as right/up/forward vectors) to a vector
of Euler angles (heading/pitch/bank).
Math > Vector > Multiply Matrix 3x3 - Multiplies two 3x3 matrices. For rotation matrices, this has the
effect of combining rotations. The output is another 3x3 matrix represented as right/up/forward
vectors.
Math > Vector > Transpose Matrix 3x3 - A rotation represented as a 3x3 matrix can be transposed,
which has the effect of inverting it, meaning the rotation is reversed.
Math > Vector > HPBAdd - Combines two rotations represented as Euler angles and outputs the
resulting rotation as Euler angles.
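For reference, the kind of conversion the HPBMatrix node performs looks like the following
Python sketch. The rotation order assumed here (heading about Y, then pitch about X, then
bank about Z) is a common convention, not confirmed from the node itself, so verify it against
the node's output before relying on it:

    import math

    def hpb_to_matrix(h, p, b):
        """Angles in radians; returns the rotated right/up/forward vectors."""
        ch, sh = math.cos(h), math.sin(h)
        cp, sp = math.cos(p), math.sin(p)
        cb, sb = math.cos(b), math.sin(b)
        # Columns of R = Ry(h) @ Rx(p) @ Rz(b): the rotated basis vectors.
        right   = ( ch*cb + sh*sp*sb,  cp*sb, -sh*cb + ch*sp*sb)
        up      = (-ch*sb + sh*sp*cb,  cp*cb,  sh*sb + ch*sp*cb)
        forward = ( sh*cp,            -sp,     ch*cp)
        return right, up, forward

Transposing such a matrix (as the Transpose Matrix 3x3 node does) swaps rows and columns,
which for a pure rotation is the same as inverting it.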
PS3 Move.me
Move.me is a PlayStation 3 application that uses up to four PS3 Motion controllers and the PS3 Eye
camera to track the position and rotation of each controller, sending that data across a network
to a client. That client, in this case, is the LightWave 11 Move.me device manager. Once accessible
to the LightWave 11 device manager, the data can be used to control your scene elements and user
interface.
PlayStation3 Setup
1) You must have the necessary hardware and physically set it up. Please refer to your PS3
console for specific setup instructions such as linking the controllers to your system.
• PlayStation3 console
• PS3 Eye camera
• PS3 Motion Controller (up to four; this example specifies two)
• Network connectivity to a LightWave11 system
2) You must also acquire the PS3 move.me software application. This is available via
the PlayStation Network Store (availability in certain markets may be limited).
3) Start the move.me application and get past its instructional splash screens.
The screen should display what the camera is seeing, and the upper left
corner of the screen will show the console’s IP address with a port of 7899.
This is the server that LightWave 11 will need to refer to shortly.
4) Repeat this next step for each PS3 Motion controller (also referred to as a “gem”). At about
three feet from the camera (PS3 Eye), point the Motion controller directly at, and in line with,
the center of the camera lens, remain stationary, and press the Motion controller “Move”
button (the largest top-side button). The gem will flash and finally settle in on a color (each
gem will have a unique color). You must be fairly still until the final color is visible; this is a
calibration procedure. After the calibration, do not rotate the camera as that will disrupt
the orientation calibration. As the controller is moved, you will see a sword (or other avatar)
represented on the screen. To disconnect the gem, press the ‘SELECT’ button on its left side.
This does not power off the gem, it only disconnects it from motion tracking in the move.me
application. You may then repeat the calibration as needed. It is important to note that
gems can only be connected or disconnected when no clients are connected to the move.me
application. After about 10 minutes of non-use, a gem may disconnect automatically
(the gem color will go out). If any clients are connected at that time, they must be
disconnected before the gem can be reconnected. This behavior may change in the future.
5) Look back at the upper-left corner of the console screen and enter that IP address
in the Move.me Manager settings. The address format is IP:port, so make
sure there is a colon separating the IP address and the port value.
6) The “PS3 Move.me” manager should now list four devices (gems).
7) A device must be enabled for its data to be received; so enable the devices of interest in
the device manager panel. Highlight a device to see its data in the tracks table below and
confirm its operation by watching the data change as you manipulate the gem.
At this point, the gem devices are available to studio and control booth.
Tip
More than one PS3 console may be connected simultaneously, allowing more than
four gems in LightWave 11.
IMPORTANT! Windows firewall rules may be preventing the data stream even though the setup
appears correct. A quick way to troubleshoot this is to disable the Windows firewall for the network
connection that the console is on and re-check for track data changes. For a direct network connection
between the LightWave 11 system and the PS3 console, there is little risk in disabling the firewall; but if
the connection is also to the internet, special firewall rules may be necessary. The Mac platform does not
have this issue.
Tip
The name of a device is its identity when used through LightWave and it should be unique
from other devices. Because this name will change depending on how the hardware is set up and what
hardware is available, it makes sense to rename devices to something you will use in your scenes and
on your system. That way, a simple name change in the device manager avoids the need to
change all scene references to a device. Device names can be changed by double-clicking on the name.
The device name and active state are remembered the next time you start Layout as well.
Tip
Take recording requires mapping device data to your scene elements; this is the purpose of
studio traits. Control booth behaviors are not recorded in a take.
3) If the studio panel is not already open, you may open it in either of two ways: a)
Open the properties for the plugin and click on the “Studio” button, b) choose
the studio menu option (provided your menu configuration has it available).
4) Edit the node graph for this new trait. In the studio panel, locate the trait named
“ItemMotion:virtual camera” and double-click its name. This opens a node graph
with a target node representing the trait’s position, rotation, and scale.
5) Add a device node to the node graph; this is located under the “Item Info” category.
6) Open the properties for the device node, and set the manager and
device to “PS3 Move.me” and “PS3_Gem0” respectively.
Tip
If the devices are presently available, they can be selected via the drop-down gadgets in the
panel. Otherwise, you may simply enter the names directly; however, no understanding of them will be
available to complete the node graph. You may continue on with the node graph and make the final
connections once the required devices are available. Once the node graph connections are made, they
will be remembered when reloading your scene even if the devices are no longer available.
7) Connect the device node XYZ and HPB to the trait Position and Rotation respectively.
8) In the Studio panel, enable the “LIVE!” button and make sure the value to its right is
reasonable for your system (suggested 10 to 30). This value is an evaluation update rate: it
determines how many times per second the scene is evaluated and redisplayed in the
viewports. This allows you to see scene changes as a result of gem movement even though
the scene time is not changing. Once this is done, you should see the virtual camera
moving in your scene as you move your gem.
Tip
The PS3 Eye must be able to see your gem’s colored orb to track its motion completely.
9) The scale and orientation of gem motion may not be appropriate for your scene.
Parenting the scene’s virtual camera to a null item is useful here, providing a
more flexible reference for the gem data. Create a null item called “eye origin”.
Tip
By default, the PS3 Eye is mapped to the scene’s origin and gem data is relative to this.
10) Parent “virtual camera” to “eye origin”. The virtual camera will now move relative to “eye
origin”. The “eye origin” item can be further parented to other items, even animated ones.
4) We must adjust the node graph of the “.LegacyZoomFactor” trait only. Double-click its name
to open its node graph. The target node has a single value called “Value”; this represents the
trait’s value.
This zoom graph relies on the Motion controller’s ‘Move’ and ‘TPressure’ button data. The ‘Move’
track is a simple toggle where ‘1’ is pressed and ‘0’ is not pressed. The ‘TPressure’ track supplies
values from 0 to 255 depending on how hard the user squeezes the ‘T’ trigger button. When
you do not know what data values are possible for a track, look at the device manager panel in
LightWave and highlight the device in question; the available data tracks and expected format will
be displayed interactively.
The result of the graph will be this: zoom in when the ‘Move’ button is not pressed and zoom out
when it is; the ‘T’ determines how quickly to zoom.
The logic node outputs a negative value when the ‘Move’ button is pressed having the effect of
zooming out. The node output values act as a scaling to affect how fast the zoom will occur. This
output is multiplied by the ‘TPressure’ value and sent to a “pow” node, which allows the zooming to
appear more natural to that of a real camera zoom.
The previous zoom factor trait value is needed to give the node graph something to modify, since
it applies a relative amount to arrive at a final absolute trait value. The “Studio Trait” node allows us
to specify which trait to use; we choose the same trait that this node graph belongs to.
The previous trait value is combined with the ‘pow’ effect and clamped to keep the resulting zoom
factor within a reasonable range.
The scalar mentioned above affects how fast the zoom will occur. The value must be very close to 1.0.
This is a very basic zoom control. One enhancement would be to add another logic node to allow
multiple zooming speeds based on the amount the ‘T’ trigger button is squeezed: a gentle press to
zoom very slowly, and a hard press for maximum zoom speed.
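The zoom behavior just described can be mimicked procedurally. Here is a minimal Python
sketch; k and the clamp range are illustrative assumptions, not values taken from the scene:

    def next_zoom(prev_zoom, move_pressed, t_pressure,
                  k=1.001, min_zoom=1.0, max_zoom=30.0):
        """t_pressure: 0-255 from the 'TPressure' track. k is the scalar
        discussed above and must stay very close to 1.0."""
        sign = -1.0 if move_pressed else 1.0              # 'Move' pressed: zoom out
        new_zoom = prev_zoom * pow(k, sign * t_pressure)  # 'pow' gives a camera-like feel
        return max(min_zoom, min(max_zoom, new_zoom))     # clamp to a sane range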
This node graph uses the gem ‘Triangle’ track to increase the light intensity while the ‘Square’ track
decreases it.
1) Add a studio trait for the “Spotlight Cone Angle”. (Add “Virtual Studio Trait” to its channel)
2) Construct the following node graph, which is very similar
to the virtual camera zoom factor node graph.
Character Performance - A combination of various chosen takes for each trait in a character.
Character Preset - A definition of a character via its traits and connections to device channels and
scene elements.
Character Track - The desired movement a character is expected to make as the scene progresses.
Control - An interpretation of device channel data to cause an action. Controls can have child
controls.
Control Board Preset - A definition of one or more control presets logically arranged.
Preset - A collection of data used to remember settings applicable to specific aspects of the virtual
studio. Its purpose is to save the user time and grief while setting up their virtual studio.
Prop - A scene element that does not cause a response from a character, although it can be
manipulated by characters.
Device - A set of channels representing input. Typically, devices represent real world hardware.
Stage - A place to arrange the cast and props and define how the scene is to progress.
Trait - A trait measures something interesting about a character that can be applied to elements in
the scene.
Interchange Tools
GoZ™
GoZ™ support has been added to LightWave 11 in both Modeler and Layout. As with any data
interchange there are certain ideal workflow practices to be aware of to achieve the best results.
You should prepare geometry with tri and quad polygons only. ZBrush™ also expects one surface
name and a single UV map on the object.
Our GoZ implementation is flexible enough to allow you to work in a variety of different ways.
You can begin your model in LightWave Modeler and send to ZBrush to create the UV map and
paint your color texture map and generate your displacement and normal maps for the object.
Then with a press of the GoZ button in ZBrush you can load the changed model back into Modeler
and your shaders will be automatically applied to the object containing all color, displacement,
and normal map information. Our LightWave Modeler GoZ implementation also allows you to send an
Endomorph on your object to ZBrush for further sculpting, and back to Modeler. Imagine the facial
targets and other effects you can create with this technique.
Alternatively, you could begin the model in ZBrush and use GoZ to send your sculpt to Modeler for
more detail, then go back to ZBrush to continue sculpting. You could then send your final result
directly to LightWave Layout, again with all the texture, displacement, and normal map shader
information automatically set up. In Layout, even your bump displacement will be automatically
set up on the object with the bump distance supplied by ZBrush to make it easier for you to get to
the render quickly.
Setting Up GoZ™ for LightWave
For our purposes here we will assume you have already installed and are using ZBrush and now
you would like to add LightWave to your sculpting pipeline. Load the model you wish to sculpt in
either Layout or Modeler. In Modeler it is the current foreground layer (or the selected endomorph)
that will be sent while in Layout it is the currently selected object. Now go to the I/O tab in either
application and choose GoZ. The first time you do so there will be a slight pause as LightWave
copies the necessary files to your ZBrush installation and then you will be in the ZBrush interface
waiting to draw your object onscreen as a tool. Once done and modified to your satisfaction,
pressing the GoZ button at the top right of ZBrush’s interface will send your object back to
LightWave, now with the additional sculpting.
Here is an overview of what GoZ offers in both LightWave Modeler and Layout:
• Support for texture maps created in ZBrush, and not just poly painting.
• You can see your displacement and normal maps with VPR.
Tip
If you select an endomorph in Modeler to send to ZBrush via GoZ, sculpt that endomorph
and then send back to Layout instead of Modeler, the base mesh will be overwritten. Always select the
endomorph that you want to send and then make sure that same endomorph is selected on the object
in Modeler before exporting back from ZBrush.
Finally, note that if you change the geometry or polygon count in the middle of your process,
vertex maps could be lost when exporting back to Modeler or Layout. This means you should get
your geometry up to a level where it has enough points and polys to hold the sculpted
detail when you “res up” in ZBrush. Do your sculpting and painting at that higher resolution level;
then, when you hit GoZ in ZBrush to send back to LightWave, the resolution automatically gets
lowered back down to the base mesh. This is the typical workflow most often used with ZBrush
and other applications.
1) We’ll add a morph map to our Mangalore head. We’ve called it Morph.Mouth_Open. Make
sure it’s selected in your Morphs list and hit the GoZ button on the Modeler I/O tab.
2) ZBrush will load or open and you can draw your warrior onscreen. Use ZBrush’s Move Brush
to spread the lips apart and hit the GoZ button in ZBrush.
3) The Mangalore head will reappear in Modeler and you’ll notice that you are still in the
Morph.Mouth_Open morph.
4) Now you can create a new morph to sculpt. This time we are typing Morph.Frown. Once
added, hit the GoZ button again.
5) In ZBrush we brush in the frown for the model and send back. You can
carry on adding new morphs and sculpting them like this.
Interchange Tools
Example: Converting ZBrush Fibermeshes to FiberFX guides
ZBrush v4R3 adds the ability to export the fibers ZBrush supports into LightWave, to be used
as guides for LightWave’s own fiber solution, FiberFX. Doing this is very simple:
1) Start with a project inside ZBrush and comb the hair as you’d like it. Since the hair from
ZBrush will only be guides for FiberFX you will not need a large amount of fibers.
2) Save your guides out of ZBrush as an LWO file and load it into Modeler.
3) The fibers need to be converted into guides for FiberFX using the StrandMaker tool on the
Setup Tab, in the FiberFX group.
4) The new polychains will be created on a separate layer, so they need to be saved to a new
object (you can replace the existing curves on Layer 1 and save over your original object if
you prefer).
5) Send this object to Layout and add FiberFX to it. Turn on Draw to see your newly-added
fibers. You’ll see that the fibers aren’t exactly following your guides. This is because of the
Random Length and Gravity settings. Set them both to 0%.
6) If you wish to increase the quantity of fur on your object, increase the
number of fibers per guide using the Fiber Quantity setting in FiberFX.
Additional GoZ™ Notes
• Surfaces and UV Maps - Don’t forget that ZBrush only expects one of each and will discard
any others you have in your object. Make sure that if you have several of either you reduce
them before sending to avoid losing your work.
• Saving your objects - It is a good idea to save your sculpt as a ZBrush tool in case of any
problems, or if you want to continue work started before; saving the LWO from Modeler alone
will not keep your sculpting information.
• Object assemblies - If you want to sculpt an object assembly as one piece in ZBrush, make
sure the parts are separate objects (not just layers) and tick Import as SubTool in ZBrush GoZ
Preferences.
• Cache - To perform the task of shifting data between ZBrush and LightWave, ZBrush keeps
a directory of intermediate files. These are normally kept in C:\Users\Public\Pixologic\
GoZProjects\Default on a Windows system and /Users/Shared/Pixologic/GoZProjects/Default
on a Mac. These temporary files are not automatically deleted and can take up a lot of room. If
you see messages asking if you would like to use the object from disk, it is almost certainly
because your ZBrush cache already has an object by the same name in it. You can clear this
directory safely (when no GoZ process is running). Alternatively, there is an option in ZBrush’s
GoZ preferences to clear cache files.
• Layout or Modeler? - When you first start sending your object between LightWave and
ZBrush, you will probably begin using GoZ with Modeler to initially prepare your object for
sculpting and texturing. Once you are happy with this first stage, you can tell ZBrush to start
sending to Layout instead. To do this, click the ‘R’ icon on the ZBrush interface near the
GoZ button to change which application GoZ communicates with. In this case, change it
over to use Layout rather than Modeler.
FBX Enhancements
FBX support has been updated to version 2012.1 to support the most recent version of
Autodesk’s MotionBuilder™ at the time of writing. Please note that Camera Export from Maya to
LightWave is only pixel-perfect when baking the envelopes in Maya, since curves are interpreted
slightly differently between applications. A command has also been added so that FBX export can
be scripted. The command Generic_ExportFBXCommand has the following arguments:
• <SceneFile>
• <Version>
• <ExportCameras>
• <ExportLights>
• <ExportMorphs>
• <ExportModels>
• <ExportAnimations>
• <ExportMaterials>
• <EmbedMedia>
• <Type>
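Because the export is scriptable, a command string can be composed from those arguments.
The Python sketch below is hypothetical: the argument order follows the list above, but the
value formatting Layout expects (boolean flags, version string, and how the command is issued)
is an assumption to verify against your installation:

    # Hypothetical sketch only: composing a Generic_ExportFBXCommand call.
    # Every value below is an assumed placeholder, not documented syntax.
    args = [
        "Scenes/MyScene.lws",  # <SceneFile>
        "2012.1",              # <Version>          (assumed formatting)
        "1",                   # <ExportCameras>    (assumed boolean flag)
        "1",                   # <ExportLights>
        "1",                   # <ExportMorphs>
        "1",                   # <ExportModels>
        "1",                   # <ExportAnimations>
        "1",                   # <ExportMaterials>
        "0",                   # <EmbedMedia>
        "FBX",                 # <Type>             (assumed)
    ]
    print("Generic_ExportFBXCommand " + " ".join(args))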
OBJ Enhancements
OBJ import and export has been further improved in LightWave 11. Large OBJs are now saved
much faster than before. OBJs textured with image maps or numeric values in other packages will
come in textured but Nodal or procedural textures will not translate in either direction. Autodesk’s
Maya™ reverses diffuse and ambient, so make sure you save OBJ files for Maya using the switch
provided on the OBJ tab of Options to ensure that your textures are transported correctly.
Unity Interchange
LightWave can now save scenes for Unity to enable a seamless transfer between the two programs.
In order to do so, make sure your LightWave scene is saved in a standard content directory format,
with your scene in Scenes, objects in Objects and all images for the scene in a directory named
Images. Ensure that your scene and object files and surface names don’t have accented characters
or spaces in them to be absolutely sure the transfer works.
Since LightWave Scene Files are not directly supported in Unity at this point, an FBX file with the
same name as the LightWave “.lws” file will be created.
Use that file in your Unity stage; it will be updated every time the corresponding LWS file changes.
A file dialog will pop up the first time Unity uses the LWS importer, asking for the path to your
LightWave directory. The path you specify will be remembered across sessions; if you would like
to use a different path, use the “LightWave > Choose LightWave Install Path” menu button.
Tip
Unity does not support animations with a scale of 0. For scenes where you will have
explosions or other things that require a scale of 0 in LightWave, use a very small but non-zero value.
Steps to Using Unity
In order to prepare your scene for use with Unity, here are the steps you need to follow:
1) Start Unity. Create a new project so that the Unity project folders are made and make
sure you include at least the Character Controller and LightWave applink standard assets.
2) Move your LightWave Content Directory to the Assets folder in that directory.
3) If this is the first time you have used Unity, it will ask you where LightWave
is installed. You can also change the path to LightWave at any time by
using the LightWave > Choose LightWave path menu item in Unity.
4) In LightWave, load your LightWave scene from the Unity project Assets folder.
5) Any time you save your scene with a change made, that
change will be automatically propagated to Unity.
• LightWave lights and cameras are not carried across. The items only serve as place holders
for Unity lights and cameras in the hierarchy view if needed.
• Textures for your LightWave scene should be baked to UV maps for the most faithful
duplication inside Unity.
• Because LightWave saves objects separately from scene files, you need to save both when
you make texturing changes in LightWave. Unity’s surfacing isn’t as dense as LightWave’s,
so you will have to either bake your surfaces (and lighting) or use image maps, since
procedurals and nodes don’t translate.
• If your surface changes don’t get updated in Unity, save your objects in LightWave and
delete their surfaces in Unity (the objects should turn bright magenta in Unity). Hit save
scene again, which will update Unity. Once done, you need to go and reassociate the new
materials with the objects. You do this by clicking on the object name in the Hierarchy view
and clicking the small circle to the right of the Material line to choose a material.
Example: Realtime Visualization with Unity
One of the frequent requests we have is for some aspect of realtime visualization for architectural
scenes. Unity in its free version is very capable of not only providing a standalone executable to
give to your clients, but also a web player version so you can share using the web. Obviously this
tutorial is not to give you a full understanding of Unity, but rather a taste of what is possible. It’s up
to you to find out more about this excellent tool.
For our example we’re using the Lighting Challenge scene British Natural History Museum,
modeled by Alvaro Luna Bautista and Joel Andersdon, available here: http://bit.ly/Hbijrp. The file is
just an object so we have put it in a scene with a dome light lighting the exterior and four spherical
lights for the interior. The steps we need to take to get our scene into Unity are as follows:
1) Once you have set up the scene for your Natural History museum in LightWave, we will
create a new project inside Unity.
4) Once it has finished you will see a new item in the Project tab. In our Scenes folder there will
be a new file with the same name as your scene with a different icon. Click on this file so it
appears in your project’s Inspector pane, like so:
The Scale Factor should be set to 1 automatically, but Generate Colliders needs to be
checked. Once you’ve done this, drag the FBX file up into the Hierarchy tab. Unity will
ask if you want to apply settings, do so and your model will appear in Unity’s viewport.
Tip
The GlobalScale variable has been changed in Unity to make LightWave objects come in at
the right scale, since the script is LightWave-specific. If you wish to bring OBJs or Max files into a scene
you have created with LightWave assets, you will need to apply a scaling of 0.01 to these objects inside
Unity.
5) In your project view you should see Standard Assets and if you open that you will see First
Person Controller. Drag this up to your Hierarchy. This may come in some distance from your
museum. If so, change the Position values in the Inspector window ensuring that the First
Person Controller graphic is completely within the museum. You can now delete the default
Main Camera item since the First Person Controller has its own camera.
6) Our original LightWave scene had several lights, so we will now recreate them in Unity.
Select the first light in Unity and select Light from the Component > Rendering menu.
Make this light Directional in the Inspector. Now select the other lights you placed in your
LightWave scene and choose the same item from the Component menu in Unity, but make
them Point lights, increase their range and reduce their strength.
7) Now you can hit the Play button at the top of Unity’s interface to go into your realtime
version of the scene. You can navigate by using the traditional FPS controls of WASD on a
QWERTY keyboard or the cursor keys while your mouse dictates the view. Our lighting is
quite harsh and there are no shadows. These can be present if you purchase the paid version
of Unity, or they can be baked into your scene and used as textures in Unity.
Tip
If you make changes to your LightWave scene, the changes will be propagated to Unity.
Remember that since LightWave saves objects separately from scenes, you will need to save both to get
any changes you make to surfacing updated in Unity.
Tip
If you follow along with this tutorial you may note you need to jump to get up the stairs and
wonder why. This is because Unity splits the single Natural History Museum object into 27 parts to fit
within the maximum of 65k vertices in a single object and that means that the steps get carved up too.
To keep the stairs in a single piece that can be climbed without jumping you will need to separate them
from the main object manually.
LightMaps in Unity
One way to make your lighting more subtle even in the free version of Unity is to use lightmaps to
recreate your LightWave lighting. Transferring these lightmaps is done by assigning them to your
surface’s Luminosity channel and Unity will automatically place them in a second material slot
dedicated to lightmaps.
Example
1) Here is a very simple scene with a box, with textures inside for the walls and floor. The
textures are images applied to the UV Map in the Color channel.
2) The only light in the scene is a dome outside the box angled so that the inside of the box is
lit partially. We’ll add a new Spherical light to the scene and position it inside the box, with a
falloff that means it just touches one wall and the floor.
3) Now we’ll set up a surface baking camera. Set it to the mesh for the box and the appropriate
UV map. Set the resolution low. We’re using 256 x 256 but you may even be able to get away
with 64 x 64.
4) Add a Compositing Buffer Export to your Image Filters (Ctrl F8) and choose Shading and/or
Radiosity.
5) Render an image. In the Image Viewer’s Layer dropdown you will see the color rendered
image, plus an image for the Shading and/or Radiosity.
6) In the Surface Editor, add the lightmap render to the Luminosity Channel. LightWave only
supports greyscale images for any channel other than Color, but when the scene is sent to
Unity, Unity will move the Luminosity Channel mapping to a secondary UV channel for a
lightmap, which can be in color.
8) In Unity, select the object that has the lightmap and you will see that the texture concerned
is listed as Legacy Shaders/Lightmapped/Diffuse. There will be two image slots in the Texture:
one Base, which should contain your image map, and one Lightmap, which may be empty. All
you need to do is click the Select button to add the lightmap image you rendered. The Lightmap
won’t look great in the viewport, but pressing Play should show it nicely in the game view.
Shadow Catcher
• Set up the background plate you want to composite the 3D elements onto:
• Using Background Image (Effects Panel> Compositing tab);
• Using Textured Environment (Effects Panel> Backdrop tab > Add Environment);
• or Using Image World (Effects Panel> Backdrop tab > Add Environment)
• Match the LightWave camera to the one that took the background plate; also match the
position from which the real-life camera took the plate shot.
• Model and position the ‘catcher’ objects that will be receiving shadows or reflections so they
align with their real-life counterparts in the background plate; that way, 3D elements you composite
into the shot ‘interact‘ with the plate by casting shadows and reflections onto it.
• Match the lighting from the original background plate.
• Reflection
Sets the amount of reflection to show up in the catcher objects.
• Color
Allows tinting of the reflection color
• Roughness
Shadows aren’t really affected by rough surfaces, but reflections are: they become broken
up or blurred on diffuse (or rough) surfaces. This setting allows you to better match the
‘roughness’ of the surface you’re compositing onto. Increasing this value will cause reflection
blurring, or, if you input a procedural texture into the Roughness input, it allows breaking up of
the resulting reflection.
Example
Let’s take a look at setting up a relatively simple compositing exercise. This beach photograph will
provide a useful background plate to composite the car model onto; the sand is wet and provides a
perfect example to illustrate the reflection-catching aspect of the Shadow Catcher node.
To help align items with the plate when we start placing objects into our scene, we need to see the
background image in the camera view. Open the Options panel > Display tab (d key) and set the
Camera View Background to Background Image. You’ll need to set the viewport type to Camera to
see the effect of this.
Go to the Modeler Tools tab > Create group > Geometry dropdown > Ground Plane
Tip
LightWave Scenes and Objects are separate files on disk, which means that whenever you save in
Layout, you need to save the Scene and Objects separately. Saving the Scene will not automatically
save objects, or any Layout-created geometry. If you save a Scene and quit LightWave without saving
objects too, you will lose any changes to object surfaces or any Layout-created geometry you generated.
Now we have the ground plane created, let’s load the object we want to feature in the composite
scene. Moving and rotating the camera into the original photo’s camera position will match the
perspective. This part is very much a matter of eyeballing until it looks right.
Tip
When taking a photo for use in compositing, you should also note details that will help at the
camera matching stage, such as camera settings, height, position from any key items in the photo, time
of day, lighting conditions. Additional photos from different directions can also be used for reflection
maps.
To apply the Shadow Catcher to the ground plane we created, open the Surface Editor (F5) and
select the ‘Shadow_Catcher’ surface we named earlier.
Once connected, open the Shadow Catcher node’s options. If the background image where
your object will be placed has a reflective surface, here you can set the amount of reflection that
matches, along with any roughness you judge the “ground” may have.
For sharper shadows use a low Angle for the Dome Light; higher angles result in softer shadows.
For very diffuse, overcast lighting, angles up to 90° may be required.
Enabling Radiosity (Render Globals panel > Global Illum tab, Ctrl p) also helps photorealism greatly.
FiberFX Enhancements
About FiberFX Enhancements
FiberFX has received many enhancements in LightWave 11. Not only is LightWave 11’s instancing
understood and used by FiberFX’s volumetric rendering, but multi-threading has also been implemented
for distribution and fiber building, greatly speeding up operations on today’s multi-processor
machines. Fibers using the new Solid Volume mode can now also be textured with nodes.
Volume Only Solid Mode (Parametric Cylinder Rendering)
Solid mode renders fibers with 3D volume at render time, in much the same way that Layout treats
2-point poly chains when you assign a negative Particle/Line Thickness for an object.
To set FiberFX to use the new Solid mode, ensure Volume Only is enabled on the main interface,
and then in the Etc tab choose the Solid volume type.
You can use the full power of the Node Editor to texture your fibers, including any of the
material nodes. UV maps are automatically available to Solid fibers, but they do require a specific
setup (see Nodal UV Mapping below).
Fiber Width
By default Solid fibers have an equal thickness across their length. To vary the thickness, click the T
button and select the Gradient Layer Type, then choose Fiber V as the Incoming Parameter and vary
the thickness.
Tip
Note: In previous versions of FiberFX, changing parameters from root to tip relied on the Fiber
U gradient type. With the addition of the new Solid Mode, an additional third dimension was needed.
This means any setups using the Fiber U parameter will need to be switched to Fiber V.
Tip
Fibers are generated at real world thickness when at 100%. You may need to increase the
thickness by a large amount depending on the size of your object.
Fiber Smooth
Fiber Smoothing is an important feature when using Solid fibers to create smooth-looking fibers. It
gives the same effect as increasing the amount of Edges on fibers, but since this is done at render
time it has little impact on memory while working in Layout. Fiber smoothing is also visible in VPR.
Tip
Note: Setting Fiber Smoothing higher than needed can impact render times. Try incremental
settings until you hit the smoothing level you need, rather than maxing it out.
Nodal UV Mapping
When using the Node Editor, you can access the automatically created UV maps for the Solid fibers.
To utilize the UV map you’ll need to add the following nodes:
The FiberFX-FiberInfo node is where you can access all the information about the fibers that FiberFX
outputs.
1) Add an Image node and connect the Fiber U and Fiber V outputs from the FiberFX-FiberInfo
node to the U Offset and V Offset inputs on the Image node.
2) Add a Standard material node and connect the Color output from the Image node to the
Color input on the Standard material node.
3) Connect the Material output on the Standard material node to the Material input on the
FiberFX node.
4) Choose an image in the Image node to map along the fiber’s UV but, importantly, set the
texture scale to 0, 0, 0. Without setting the scale like this, your texture will not be mapped
correctly; the reason is outlined in the tip below.
Tip
Because the Solid fiber geometry is virtual (created at render time), we need to use a clever
trick to drive the UV lookups directly. By setting the texture map scale to 0, 0, 0 in the Image node, we
can feed the UV inputs for the texture with the FiberFX-FiberInfo node’s UV outputs, bypassing the
internal UV mapping mode.
Fiber Relax
You can now relax the fiber distribution across your object in a series of iterative steps that
gradually even out placement, making the fibers more evenly spread. There are 40 levels of relaxation
and each one marks an iteration, so setting it immediately to 40 will take some time to show results,
depending on the quantity of fibers to be processed.
FiberFX Polygonization
FiberFX can polygonize the fibers you create for faster rendering if needed. Right-clicking
on an item in the list on the left of the FiberFX window brings up a menu. Select
Polygonize:
• Sides: 1-sided is a single poly chain; 2-sided is flat with polys on each side; >2 is an extrusion
of n sides.
• Flatten: flattens the tube.
• Triangles: makes tris instead of quads.
• Billboard: makes crossed polygons suitable for mapping.
Workflow Enhancements
Introduction
Features are nothing without good workflow. This is why LightWave 11 has a number of
improvements to areas of the workflow; no matter how small they may be, they all add up to a
better user experience and faster working.
Align to Plane
If you have ever needed to model something off-axis in Modeler you know how hard it can be to
line things up. LightWave 11 introduces the Align to Plane function, which is located on the lower
edge of the Modeler window.
To demonstrate how it works, a simple example is best. Let’s imagine we want to place a source
object - like this yellow pill - on the side of a target object - the blue sphere - and have it align
perfectly.
Ensure that both the source and target objects you want to align are in separate layers and sitting
on the “ground” (F3). Cut the source object (Ctrl x) into the clipboard, then select a polygon on the
target object where you want to align the source object, and click the ‘Align to Plane’ button.
Now paste the source object you previously cut (Ctrl v) and finally press the ‘Restore Align‘ button.
We could then take this further and use the Array tool (Multiply Tab > Duplicate > Array) to copy the
placed item around the object. This kind of technique is useful for placing ‘nurnies’ on a spaceship,
rivets on a ship, and so on. However, this is just one basic example; there are many other uses for this
simple yet useful tool.
Tip
The Align to Plane tool creates a 2-point polygon in the first free background layer to store
the rotation. This allows you to save an object mid-way through any modelling work during an
align, quit Modeler, come back later and carry on; when you restore the alignment, Modeler knows
exactly the transformation needed to align the model back to normal.
Layout Modeler Tools
New Cube, Sphere and Ground Plane primitives have been added to the Create > Geometry group in
Layout’s Modeler Tools tab. The Ground Plane primitive is very useful for setting up
quick scenes to test Shadow Catching, Flocking or Instancing, or, more importantly, for adding a floor to
a scene quickly. Objects are given logical item names when created, including surface names, so
that they are easy to tell apart should you create several primitives at once.
The lower part of the primitive creation window has options for saving the new object immediately
at creation time. By default, saving is not enabled so as not to interrupt your workflow, but you will
need to manually save your objects to disk if you decide to save your scene.
Connected to Lyrs
If you create a fractured object, by default all of its parts will be on the same layer, which
LightWave is absolutely fine with. If you do need to distribute the parts of your fractured object (or
any other object, for that matter) to discrete layers, this tool is ideal. It takes all unconnected
shapes in a single layer and distributes them over as many layers as needed. It can be found on the
new Layers tab in the default Modeler menus, or the Utilities tab in the Studio Production Style
menus.
Surface Editor Enhancements
If you have a complex scene containing many objects with many surfaces, it can take a long time to
navigate to the surface you need. As a little workflow aid, we have added the ability to right
click on a scene element and Expand all Surfaces.
Searching Nodes
The list of available nodes can now be searched to quickly find a specific node. There are also
options for case-sensitive searching and flat list results (the matched nodes will appear in a flat list
as opposed to a hierarchical list showing the node group they are located in). There is also a quick
‘Clear Search’ option to clear out a search and get back to the full list. All of these options are found
in the popup menu to the right of the search field.
The previous way of adding nodes via the popup menu is also still available; the small dropdown
arrow next to the Add Node button is where the original node menu is located.
Carpaint Improvements
The two scenes are identically surfaced with the Carpaint material, except that the scene on the right has no Iridescence, while the one on
the left has Iridescence at 100% with a Thickness of 2.0 nm and a Wavelength of X: 425, Y: 380, Z: 750.
The Carpaint node in LightWave has several new parameters to increase the realism of your
renders:
• Iridescence
Also known as goniochromism, this describes how some surfaces change colour with
the viewing angle, as seen with soap bubbles and sea shells. It is a percentage
value dictating how much light will be reflected from your surface, and it has some specific
parameters:
• Thickness
This is the distance, measured in nanometers, between the surface of your car paint and
the paint layer itself. Varying it will cause interesting effects as the wavelengths from
the following three fields complement and interfere with each other.
• Wavelength
The three Wavelength fields, labelled X, Y and Z, set the wavelengths of light (in
nanometers) whose interference produces the iridescent colour shift.
• Clearcoat Fresnel
The Clearcoat layer in the Carpaint node now has a Fresnel function to give a better look to
the transparency it generates.
Fresnel Node
The Fresnel node now has an Inverse output in addition to its normal Result output, meaning an
additional node is no longer required to invert the result.
NGon Light
A new light type has been added to LightWave. Called NGon, it allows you to create n-sided lights
to use as bounce cards for specific studio-type lighting situations. Where before you might have
used a luminous polygon to get a five-sided bright patch on your car hood, you can now
directly create a five-sided light.
Render Globals
Previous versions of LightWave had many options that affected the final render located in various
separate panels. In LightWave 11, these panels have been grouped together for faster access.
The Render Globals panel now hosts the Camera and Light options. The shortcuts for opening
these panels are the same as in previous versions of LightWave; nothing has changed in that regard.
It simply means checking your render settings is now much easier, with all these items in
one panel under their respective tabs. The Render Globals panel can now also be opened with the
keyboard shortcut Ctrl p.
VPR Enhancements
VPR has received major enhancements behind the scenes, including the ability to render lens
flares, point/line/edge/outline rendering and clip maps. LightWave 11 Instances are also fully
supported. Ctrl F9 now switches between VPR and OpenGL views in Layout.
Even while VPR is still resolving, you can now Shift Click on part of the image and the Surface Editor
will open with the selected surface, or switch to that surface if the editor is already open.
Limited Region Enhancements
The Limited Region gizmo in Layout’s Camera viewport has been updated to better reflect whether
you are using ‘Borders’ or ‘No Borders‘. When using ‘No Borders’, the resulting render will be
cropped to the size of the Limited Region specified, whereas ‘Borders’ will keep the render size as
set in the Camera Properties, but will only render the Limited Region portion of the image.
Left: Borders. Right: No Borders.
Low-Discrepancy Sampling
The Blue Noise sampling pattern has been replaced in LightWave 11 by a Low-Discrepancy
pattern that renders faster and more accurately. Low-Discrepancy is a pattern much like our
classic sampling patterns, but it is different for each pixel. By using Low-Discrepancy, we can
combine the sampling for anti-aliasing with the sampling for effects like soft shadows, reflections,
refractions and radiosity so that they complement each other. This allows fewer samples to produce
less visual noise, and it is one of the mainstays of LightWave 11’s Unified Sampling.
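As an illustration of the general idea only (LightWave’s actual pattern is internal, so this is not its implementation), a classic low-discrepancy sequence such as Halton fills the unit square far more evenly than pseudo-random samples:

def halton(index, base):
    # Radical inverse of a 1-based index in the given base: the classic
    # Halton low-discrepancy sequence.
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# 2D sample positions in the unit square, using bases 2 and 3
samples = [(halton(i, 2), halton(i, 3)) for i in range(1, 17)]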
Print Camera
To aid those working in print, LightWave 11 has a utility to help work out the camera frame size (in
pixels) that would be an equivalent size in print. This replaces the previous ‘Print Assistant’ utility
and features more settings, as well as international standard paper presets. It is accessed through
the Print Camera button on the Render tab in the default and Studio Production Style menus.
• Use Template - Leaving this ticked gives you the dropdown list just underneath with a long
list of standard page sizes.
• Template - If Use Template is not checked, this dropdown list will be ghosted. If Use Template
is checked you will be presented with 47 different standard sizes of paper.
• Size Unit - Whether you use any of the standard templates or create your own measurements,
you can choose to display the page size in the units of your choice: Decimal Inches,
Centimeters, Millimeters or PostScript Picas.
• Constrain Proportions - This option is ghosted if you are using Templates, but if you are
rendering to a custom sheet size, checking it will keep the proportions the same
between the width and height values you set.
• Width and Height - These values are only available if Use Template is unchecked. The
dimensions will be in the Size Unit you have chosen.
Workflow Enhancements
• Bleed - Adds the amount of bleed to the resulting camera frame size. The value is specified
in the ‘Size Unit’ setting and is added to all sides.
• DPI/PPI - This is the target resolution when printed. Images on screen have no DPI, since it is
a print concept; PPI (Pixels Per Inch) is perhaps more appropriate here, but the label DPI
is used on the interface for those who are more familiar with that term.
• Format - You can quickly choose between Landscape and Portrait here.
• Set Camera Frame Size - Checking this option will set the selected camera’s frame size to the
new size when you click OK to close the window. It will also set the camera Pixel Aspect
Ratio to 1.0, as any other value would make little sense for print.
Information regarding the size your image will be in pixels is shown underneath the controls, along
with a graphical representation that displays any bleed specified. The percentage value marked
in the upper left-hand corner shows an approximate scale value compared to the real size of the
page.
Tip
When you open your render in a package such as Photoshop, then look at the ‘Image Size’,
it will probably say the image DPI is 72. This does not mean that the 300 DPI/PPI image size you set in
the Print Camera settings is wrong. It simply means you need to set the image DPI that will be stored
with the document in Photoshop. When you set your print size and DPI in the Print Camera utility, it
calculates how many pixels need to be in your resulting image, so that when printed at the target DPI
you specified, it will print with the correct quality.
To change the DPI of a document in Photoshop, go to the ‘Image Size’ window, uncheck ‘Resample
Image’ and enter the DPI you want the document to be set to. Notice how the size of the image on
screen did not change. There are still the same number of pixels in the document, all you are adjusting is
how many of those pixels to use per inch when printed, which is why the ‘Document Size’ changes.
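To make the arithmetic concrete, here is a minimal sketch of the calculation the Print Camera performs; the variable names are illustrative, not LightWave internals:

# A4 page in decimal inches, 0.125" of bleed added to all four sides, 300 PPI
page_w_in, page_h_in = 8.268, 11.693
bleed_in = 0.125
ppi = 300

frame_w = round((page_w_in + 2 * bleed_in) * ppi)   # 2555 pixels
frame_h = round((page_h_in + 2 * bleed_in) * ppi)   # 3583 pixels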
Render Buffer Enhancements
In previous versions of LightWave, if you wanted multiple buffers saved to disk you
needed to use the Render Buffer Export image filter, perhaps several times, to output all
the buffers you needed. LightWave 11 presents a new image filter called Compositing Buffer Export,
which allows you to set up all the buffers you wish to export in a single window.
The Compositing Buffer Export is found in the Image Filters panel, opened with Ctrl F8 or via the
Windows > Image Processing menu item. It can be added to your scene by selecting it from the
“Add Image Filter” dropdown menu. Once added, double clicking it presents its options window.
• Buffer Set - This field allows you to type in a descriptive name for your buffer set. This does
not have to be the same as the Preset name. If you type a name in here, it will be the default
name presented when you save a Preset.
• Destination - This is a dropdown of four options for where to send the resulting buffers:
• Image Viewer - This will give you your final render, but also the buffers you selected as
additional layers in the Image Viewer, just like the alpha layer that’s there by default.
• Image Files - Saves images with the base name you set, at the location you set, for every
buffer you selected.
• Rendered RGB - Saves a single buffer out as the RGB output from Render Globals.
• Rendered Alpha - Saves a single buffer out as the Alpha output from Render Globals.
• Negative - Inverts any greyscale channel.
• Normalize - Takes the values in the chosen buffer and ensures that they fit a “normal”
definition. For example, if your render is generating grayscale levels between 40 and 783,
Normalize will remap the values to fit within a 0-255 range (see the sketch after this list).
• Buffers - There are 29 different render buffer types that you can include. For descriptions of
each of these, please refer to the LightWave 10 Manual.
• Objects - This tab allows you to select only the parts of the scene whose buffers you are
interested in exporting. By default everything is ticked, so everything in the scene is
included. You do not need to uncheck non-geometric objects, since they leave no trace on
buffers.
• Select All, Select None, Invert Selection - Allow you to quickly select/deselect all, or invert your
current selection.
• Presets...
• Open Presets Shelf - This option opens the Presets Shelf. You can also use the keyboard
shortcut F8.
• Save Buffer Set Preset - If you have typed a Buffer Set name in the field at the top of the
window it will be presented here, highlighted so that you may edit it. If you have typed
no name, then whatever you enter here will also be entered in the Buffer Set name.
• Ambient Occlusion Options - Ambient Occlusion requires radiosity in your render. If you
choose Ambient Occlusion as one of the buffers you wish to export, this button will become
available to you, with additional settings as follows:
These options are similar to those presented in the Surface Node Editor’s Shaders >
Diffuse > Occlusion node and work in the same way. If you choose Infinite from the Mode
dropdown, Range is ghosted and the full extent of the scene is used as the basis for the
effect. If you switch to Range, the Range field becomes available and you can set the
distance within which Ambient Occlusion looks for occlusion. In addition, there is an option for
using transparent shadows, which shows the shadows for transparent items in your scene.
• Depth Options - If you choose Depth as one of the buffers you wish to export, this button will
become available to set further options relating to the Depth Buffer.
• Normalize Depth to 8-bit - If you are saving 8-bit per channel images this toggle should
be checked. You should uncheck it if you are saving your files as floating point/high
dynamic range images.
• Limit Buffer Depth - Normally this is ghosted and only becomes available if Normalize
Depth to 8-bit is unchecked. It allows you to define an absolute depth to your scene.
• Buffer Depth - This option is ghosted unless you have checked Limit Buffer Depth.
It determines the depth you wish your scene to have. This value can be animated
over time, or be based on a texture.
• Bilinear Filter - If you have unchecked Normalize Depth to 8-bit you can also
determine the Bilinear filter that should be used on your depth buffer. There are
four levels of sharpness available.
• Normal Options - If you choose Normal as one of the buffers you wish to export, this button
will become available to set further options relating to the Normal Buffer.
• Flip Normal X, Y and Z - matches orientation for a target application or game engine.
• Create Alpha Map - This places a 1-bit alpha channel into the normal pass that clips off
all areas that are not filled with geometry. This can be handy for games work where this
channel is used as a clip map in-game.
• Edge Padding - This creates a user-defined padding so that there are no seams when the
size of the map is reduced.
• Use Background Color - This is for choosing a neutral background color. By default,
ticking this option will use 128, 128, 255 for the normal map background rather than going to black.
• Normalize - This corrects the magnitude of the normals in the map to fit within the
“normal” range.
• Adjust 8-bit Color - There are three options in the presented dropdown:
• None - This leaves the normals as they are, with a range between -1 and +1. Suitable
for High Dynamic Range image output.
• Normalized 8-bit - This takes the range of -1 to +1 and converts it to values
between 0-255 for an 8-bit image, like a JPEG.
• No Negative - If the map consists of a range of values between -1 and +1, this
option clips everything below 0 so that only positive values are retained.
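The sketch below illustrates the two remappings described in this list: Normalize for grayscale buffers and the Normalized 8-bit conversion for normals. It is an illustration of the math only, not LightWave’s internal code:

def normalize_gray(values, out_max=255):
    # Remap arbitrary grayscale levels (e.g. 40-783) into the 0..out_max range.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) * out_max for v in values]

def normal_to_8bit(n):
    # Map a normal component in the -1..+1 range to 0..255 (Normalized 8-bit).
    return round((n + 1.0) * 0.5 * 255)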
Tip
If you are using scenes created in a previous version of LightWave that used the Render Buffer
Export Image Filter, you will need to take note of the buffers you are exporting with those Image Filters
and then add the Compositing Buffer Export filter to your scene and recreate the list of buffers there. The
Render Buffer Export filter’s interface no longer opens in LightWave 11.
Auto Key Enhancement
The Auto Key function has been refined in LightWave 11 to allow quicker access to the various
modes it can operate in. Previously this feature was hidden away in the Options panel, but it is now
located directly on the main interface in Layout. The Auto Key preference in the Options panel now
only serves as a default startup preference.
A new option called ‘Auto Key: Existing‘ now also appears in the Auto Key popup menu. This mode
was available before but was somewhat hidden; it is now made much clearer which Auto
Key mode you are in, and the checkbox acts as a visual indicator of when Auto Key is active and
when it is completely off. Auto Key now has the following modes:
• Off - Automatic keyframing is completely off; no new keyframes will be created and no
existing ones modified.
• Modified Channels - New keyframes will be created or modified, but only on the channels you
edit. For example, if you only move an item, keys will be created or modified only for the
translation channels you change, but not for rotation or scale.
• All Motion Channels - New keyframes will be created or modified on ALL channels, regardless
of whether you only move, rotate or scale an item.
• Existing - Only existing keyframes will be adjusted; no new keys will ever be created. This
allows tweaking of keys without fear of creating new ones. In previous versions of LightWave
this mode was available when Auto Key Create Default in Options was set to Off but the Auto
Key button in Layout was active; now you just need to select this option.
Tip
Note that the Auto Key indicator at the bottom of Layout works in a similar manner to the
original Auto Key on/off button. For your Auto Key preferences to be saved, you still need to visit
the Options panel. This setting is not scene-specific.
Clone Instance
In addition to cloning objects in your scene with the usual Ctrl-c shortcut, you can now also clone
your current object as an instance. This adds a null with the same name as your object, with the
suffix _inst, allowing you to duplicate scene items without increasing memory usage.
HyperVoxels Blending
The Metaball blending method from the previous version of HyperVoxels has been re-implemented
and can now be used with all HyperVoxel types. Rather than just retaining the two former preset
choices (Metaballs 1 and Metaballs 2), the implementation was revised to allow control of the
algorithm via a percentage entry field, and animation of the value using an envelope.
To enable it, first check the Blending toggle. The Blending Scale field then allows control of
the blending algorithm. The former Metaballs 1 option can be matched with a setting of 75%, and
the Metaballs 2 option with a setting of 100%.
It is now possible to use any value, including values above 100%. This helps to provide quite a
range of looks, including more fluid-like HyperVoxels setups. The setting can also be animated,
something not possible before.
Tip
The upper limit for particles, previously fixed at 1 million, has been removed. Be aware
that although there is now no fixed upper limit, increasing the number of particles beyond what
your available physical memory can hold will result in LightWave crashing.
Scene Loading Speed Optimizations
Loading times for Scenes have been greatly optimized in LightWave 11. The amount of gain will be
dependent on the scene, but certainly for scenes with lots of items, a significant benefit should be
noticeable.
Nodal Motion
In the motion modifiers for a Layout item there is a new Nodal motion modifier that allows you to
apply a nodal network’s vector output to your item’s Position, Rotation and Scale.
Align and Distribute
Located on the Modify Tab > Arrange group in Layout, this small but useful set of tools makes
aligning and equally spacing items in Layout easier. The Align tools work in much the same way
as the Align to Last Point tools in Modeler, but for any Layout scene items. To use them, select the
items you wish to align, selecting last the item to which you want the other items aligned. From
the Align menu choose which axis (X/Y/Z) you wish to align against. Once clicked, all the items you
selected will be aligned to the last item on the axis you picked.
Left: the selected items; the dark blue sphere was the last selected item. Right: all items are now aligned on the chosen Y axis.
To distribute, or equally space, items you don’t need to worry about the selection order: simply
select the items you wish to space equally between each other, and choose the axis (X/Y/Z) along
which you want to space them.
Left: items aligned on the Z axis, but unequally spaced. Right: items equally spaced between each other along the X axis.
Please Note: Due to the way Generic Scripts operate in Layout, these tools will not contribute to
any Undo. Make sure you save your scene if you are unsure whether you want to keep the changes, or
you will lose the original positions of the items.
Python Scripting
About Python Scripting
Python is incorporated into both Modeler and Layout, and can be accessed directly using the
console button in either. You can also load individual Python scripts by clicking on the Python
button on the interface. Python scripts can be added to the interface in exactly the same manner
as LScripts, meaning that the end-user will see no difference between a script and a native function
(other than speed of execution).
Access to Python is identical between Layout and Modeler. In the Utilities tab there are two
buttons in the Python group. Open Console opens a Python console that allows you to immediately
execute commands within the Python environment, while the Python button opens a file requester
for you to choose a Python file to execute.
Python has no dedicated command set the way LScript does. In Python, commands are executed
in the same way you would from a C++ plug-in: as text commands. For a list of the commands
available to the lwsdk Python module, please refer to the commands.html section of the LightWave SDK.
Python Resources
A LightWave manual is not going to teach you how to program in Python, but fortunately there are
lots of people out there who can and it doesn’t have to cost anything. In addition to the example
scripts and SDK installed with LightWave there are some very good sites offering free courses:
http://docs.python.org/tutorial/index.html - A good place to start is with the docs for Python itself.
http://www.diveintopython.net/ - An open source guide for programmers who are new to Python,
but not programming in general. Available in several languages.
Open Console
LightWave’s Python console serves two purposes. It can act as a single-line interpreter for single
commands to be passed straight to Python, or if you click on the down arrow on the right side of
the window, you can change the console into a multi-line editor. When in multi-line mode, you
can write an entire script using the editor, and only when you press Ctrl Enter will the script be
executed.
• Clear Log clears the output window at the top of the console of any command history and
output (both informational and error) from the Python interpreter.
• Save Log will save the entire log to a text file.
• Open on Error will open the console if a script has a problem, whether it be written in the
console itself, or loaded from disk with the Python button.
• Max Lines sets how much history to keep in the output window; the default is 100 lines. A
setting of 0 gives unlimited history. The history is only kept for the current LightWave
session; if you quit LightWave without saving the log, it will be lost.
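For example, in single-line mode you can import the lwsdk module and inspect any exported class directly; the help() command used here is the same one mentioned in the Tip later in this chapter:

import lwsdk
help(lwsdk.IDisplacement)   # prints the class’s inheritance hierarchy and methods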
Python
This button allows you to load and execute an existing Python script from disk. When Python
loads a script, it automatically saves a compiled (*.pyc) version of the text script (*.py). Python
will only skip this stage if the timestamp on the text version is older than, or the same as, that of
any compiled version that may already exist.
However, if the text version is newer than any existing compiled version, even selecting the
compiled script will cause Python to re-process the text script and create a new compiled version,
overwriting the older one.
If the compiled version is the only file available, Python will simply run it. This means that those
wishing to protect their Python source can distribute only the compiled version.
Writing a Python plug-in for LightWave 11
This is a brief article intended to detail the design philosophy and best-practice approaches to
writing a Python plug-in for LightWave 11.
LightWave’s plug-in API is designed with a “factory” approach in mind. When a plug-in
is activated, a Factory is created. Once available, a Factory knows how to “create” and “destroy” its
product. Python for LightWave adopts this design by having an actual Factory class defined for
each architecture that LightWave will employ when it needs to create or destroy an instance. These
Factories produce instances of a plug-in for LightWave to consume, and then determine their
disposition when LightWave indicates that they are no longer needed.
For each plug-in type currently available in Python for LightWave, there is a corresponding Factory
defined to manage its instances:
• GenericFactory
• CommandSequenceFactory
• ShaderFactory
• ObjReplacementFactory
• DisplacementFactory
• LayoutToolFactory
• ChannelFactory
• ItemMotionFactory
• MasterFactory
• CustomObjFactory
• ImageFilterFactory
Programming by Contract
“Programming by Contract” is a fancy software term that really just means that some component
of a product guarantees to provide a given set of unchanging functionality, or a known “interface,”
and by doing so, different products that provide this same level of functionality can be reliably
interchanged.
For example, every motor vehicle on the planet guarantees to provide a given set of common
and immutable features and abilities, such as an engine capable of moving the vehicle, a braking
system capable of stopping the vehicle, and a steering mechanism whereby the vehicle can be
maneuvered. Although there are thousands of different types of motor vehicles in the world,
once you have mastered the common features of one, you pretty much know how to use them all,
regardless of how they look or who made them. A car is a car when your goal is getting from point
A to point B, and so every car becomes interchangeable to you.
The plug-in types that you can create in Python for LightWave each have a corresponding interface
from which your own Python classes must inherit. To denote that they are interfaces, and therefore
represent the “contract” between your class and LightWave, each is prefixed with a capital “I”:
• IGeneric
• ICommandSequence
• IShader
• IObjReplacement
• IDisplacement
• ILayoutTool
• IChannel
• IItemMotion
• IMaster
• ICustomObj
• IImageFilter
Typically, there will be an inheritance “chain” from which a plug-in interface inherits before it
ever reaches the Python level. For example, one of the available plug-in interfaces that a Python
plug-in can inherit from is called “IDisplacement” (remember that the “I” prefix indicates that this
Python class is an interface, the paper upon which your “contract” with LightWave is written). The
“IDisplacement” plug-in interface itself inherits from several other interfaces: “IHandlerFuncs”,
“IInterfaceFuncs”, “IGizmoFuncs”, “IItemFuncs” and “IRenderFuncs”. Each of the interfaces
from which “IDisplacement” inherits contributes its own methods -- and sometimes data
members -- to the sum of the “IDisplacement” interface.
Tip
Don’t start sweating just yet; this is just informational. All of this is internal, and it is
already in place for the plug-in writer by the time it reaches Python. And remember that you can always
see the inheritance hierarchy of an exported class by using the Python help() command on it (see the
“PCore Console” article for an example of this).
So, by the time the Python plug-in writer uses a plug-in interface like “IDisplacement”, many
methods are automatically added to the class defined for the Python plug-in. In order to help
distinguish methods, these “grandparent” classes from which a plug-in interface inherits will
have prefixes added to their method names. These prefixes are intended to not only help with
readability (i.e., you can look at a Python plug-in and know pretty quickly which methods are
inherited), but they also provide tidy name spaces to keep methods separated. In other words, you
won’t accidentally override an interface method by using your own method name. Most methods
that can be overridden will be prefixed so there’s no confusion. Let’s continue the example by
looking at a Displacement Map Python plug-in.
Anatomy of a Python Plug-in
One of the example Python scripts provided with the SDK is called “lazy_points.py”. This is a
Displacement Map plug-in, based on several incarnations that preceded it. I won’t reproduce the
entire script here; I will simply highlight the salient parts to augment our discussion.
In order to use the elements of the LightWave SDK in your Python script, you must import the
“lwsdk” module. This gives you access to all the LightWave SDK elements.
import lwsdk
Using this form of “import” requires that all elements imported from this module be accessed
by their name space prefix (in this case, “lwsdk.”). You may import all the elements within the
“lwsdk” name space into your own so that they can be referenced without the prefix, but it is good
practice to keep the prefix in place, as it avoids collisions between identically named elements and
improves the readability of your script.
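For instance (the mechanics here are plain Python, nothing LightWave-specific):

import lwsdk              # elements are referenced as lwsdk.Name (recommended)
# from lwsdk import *     # pulls everything into your namespace; risks name collisions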
With the “lwsdk” module imported, we can begin creating a new class that inherits from a valid
LightWave plug-in interface (in this case, “IDisplacement”).
class lazy_points(lwsdk.IDisplacement):
...
This declaration creates a new class based on the “IDisplacement” plug-in interface. At this point,
we have agreed to a “contract” with LightWave that guarantees we will provide an unchanging
set of known functionalities that will make our class respond to LightWave like an “IDisplacement”
interface, no matter how much unique content we add to the class.
Once the class is defined, you are free to start adding your own methods to the class. However,
in order to provide any useful functionality to LightWave, you’ll need to override some of the
inherited methods. First up, we will define a method that overrides the entry point for the UI
callback (inherited from the “IInterfaceFuncs” interface, which paraphrases elements found in
the SDK’s LWInterface structure). As mentioned previously, the “IInterfaceFuncs” interface is a
“grandparent” inheritance of our class, so the methods it contributes to “IDisplacement” will have a
name space prefix called “inter_” to uniquely identify them:
# LWInterfaceFuncs ------------------------------------
def inter_ui(self):
...
For those familiar with the LightWave SDK, this Python method maps directly to the options()
callback in the LWInterface structure. It is within this callback that the plug-in’s user interface
should be constructed and displayed.
Tip
To support GUIs in Python plug-ins, the LWPanels UI toolkit has been hand-wrapped for use
by Python plug-ins, and examples of this can be found in the provided sample plug-ins.
It should be noted at this point that XPanels is not currently supported in the LightWave 11
Python implementation. Support for this may come in later releases.
# LWRenderFuncs ---------------------------------------
def rend_newTime(self, frame, time):
...
Like “IDisplacement,” most Python plug-in interfaces also inherit the “IHandlerFuncs” interface (a
paraphrase of the LWInstanceFuncs structure), which provides the load(), save(), copy() and
descln() methods, each with the “grandparent” identifier prefix of “inst_”.
# LWInstanceFuncs -------------------------------------
def inst_load(self, state):
...
def inst_descln(self):
...
Something to note is that the functionality of the LWInstanceFuncs structure has been split
between two different entities within Python for LightWave. As noted above, the load(),
save(), copy() and descln() functions have been placed into the plug-in interface domain,
while the create() and destroy() functions have been more properly delegated as Factory
functionalities.
# LWItemFuncs ----------------------------------------
def item_changeID(self, itemid_list):
...
Finally, the “IDisplacement” plug-in interface itself defines its own methods, evaluate() and
flags(), which your class needs to override to implement the Displacement Map functionality.
Because your class inherits directly from “IDisplacement”, these methods are considered to be your
property, and as such, they are not prefixed.
# LWDisplacement --------------------------------------
def flags(self):
...
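Putting the pieces together, a skeleton of such a plug-in class, using only the method names discussed in this article, might look like the following. The method bodies are elided, and the evaluate() argument name is an assumption, so check the SDK documentation for its exact signature:

import lwsdk

class lazy_points(lwsdk.IDisplacement):
    # LWInterfaceFuncs: construct and display the plug-in’s UI
    def inter_ui(self):
        ...

    # LWRenderFuncs: called when the render time changes
    def rend_newTime(self, frame, time):
        ...

    # LWInstanceFuncs: load settings and describe the instance
    def inst_load(self, state):
        ...

    def inst_descln(self):
        ...

    # LWItemFuncs: react to item ID changes
    def item_changeID(self, itemid_list):
        ...

    # LWDisplacement: the interface’s own, unprefixed methods
    def flags(self):
        ...

    def evaluate(self, access):   # argument name assumed; see the SDK docs
        ...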
Producing Instances
Once you have constructed a subclass of a valid plug-in interface (for example, “lazy_points”), you’ll
need to provide a Factory that will be responsible for managing its instances. As noted previously,
a Factory has been defined for each available Python plug-in interface in LightWave 11, and an
instance of each will need to be created to manage the life cycle of each instance of your plug-in
interface subclass.
A Python plug-in Factory has a simple implementation. Continuing with the Displacement Map
example, the Factory defined for the “IDisplacement” interface looks like:
class DisplacementFactory(IDisplacementFactory):
    """ Default Python Factory for IDisplacement """

    def name(self):
        return self._plugin_name
As you can see, the Python Factory implements the “IDisplacementFactory” interface, which itself
inherits from a C++ class called “IFactoryFuncs”. This inheritance chain provides methods like
create(), destroy() and name(). These methods are called by LightWave to produce instances
of your plug-in interface class, and to dispose of the instance later when it is no longer required.
Initializing a Factory instance requires a name for the plug-in class. This is used by LightWave to
identify the class of the plug-in. This name corresponds identically to the value you would provide
in the “name” data value when constructing a ServerRecord for a native-code plug-in. In addition,
a reference to the Python class to be instantiated must also be provided to the Factory instance
when it is constructed (this is the “klass” argument to the initialization code above).
As you can see from the Factory internals snippet above, this information is used to generate
instances of the referenced class. When each class instance is created, the architecture-specific
context information passed in by the application is forwarded on to the instance for its use (see the
LightWave SDK documentation to know what type of context it expects, if any). As instances are
created by the Factory, links are maintained within the Factory to the instance, and released by the
Factory only when the destroy() method is invoked for that instance. This behavior actually has
a very practical application: A link must be maintained to the created instance to prevent Python’s
garbage collector from reclaiming it at inopportune times. Only when the destroy() method
is called, and the instance is removed from the internal list (thereby decrementing its internal
reference count within Python), can it safely be reclaimed.
All these details are handled for you automatically by the default Factory implementations.
However, you need to be aware of how a Factory functions (and why) should you wish to subclass a
particular Factory type to provide your own custom functionality.
Tip
The provided Master Python example does just this, subclassing the “MasterFactory” class
with some trivial functionality, and then using this subclass to manage the plug-in instance data life
cycle instead of the default Factory implementation.
A Factory on its own, however, is not a complete solution. You will need to define two final elements
in order to make your plug-in live within the running application. These items are the ServerTagInfo
and ServerRecord structures.
ServerTagInfo = [
    ( "Python Lazy Points", lwsdk.SRVTAG_USERNAME | lwsdk.LANGID_USENGLISH ),
    ( "Lazy Points", lwsdk.SRVTAG_BUTTONNAME | lwsdk.LANGID_USENGLISH ),
    ( "Utilities/Python", lwsdk.SRVTAG_MENU | lwsdk.LANGID_USENGLISH )
]
Here we define the SRVTAG_USERNAME and SRVTAG_BUTTONNAME values that provide the text
that will be displayed to the user in various contexts, and the location hint, using SRVTAG_MENU,
for where the plug-in should appear on the application’s interface. The types of
values available for definition in a ServerTagInfo structure can be found in the LightWave SDK
documentation.
With a defined ServerTagInfo (one for each plug-in instance to be managed), a ServerRecord must
be constructed at the top-level of the module’s environment (i.e., as a “global” variable):
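Based on the Factory and ServerTagInfo shown earlier, the snippet referenced below looks essentially like this (the exact plug-in name string is an assumption):

ServerRecord = {
    lwsdk.DisplacementFactory("Python Lazy Points", lazy_points): ServerTagInfo
}
# Key: a Factory instance, constructed with the plug-in name and the class
# ("klass") to instantiate. Value: the ServerTagInfo defined for that class.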
A ServerRecord is a Python dictionary (a series of key/value pairs). As you can see in the snippet, an
instance of the Factory that will manage instances of the “interface” class is used as the key for the
dictionary entry, and the ServerTagInfo defined for the plug-in class is used as the value for the
pair. Any number of key/value pairs can be defined within the ServerRecord, one for each plug-in
class definition to be found within the file.
While any number of ServerTagInfo structures can be defined (each with a unique name, of course),
only one ServerRecord instance should exist in your script, and it must be named “ServerRecord” to
ensure that the PCore system can locate it.
Bob Hood, the author of this article and a NewTek developer since LightWave started, has also made an
LScript to Python converter. It’s available at http://www.lucidgears.com:21134/
Information
Updates
For updates to LightWave and this manual, please check your user account at:
http://register.newtek.com
LightWave Community
To connect with other LightWave artists for tips, tricks and tutorials, visit our online community:
http://forums.newtek.com
All other brand names, product names, or trademarks belong to their respective holders.