User's Guide, XL2 Plus Analyzer, V 8.5
Thermo Fisher Scientific Inc. provides this document to its customers with a product purchase to use in the
product operation. This document is copyright protected and any reproduction of the whole or any part of this
document is strictly prohibited, except with the written authorization of Thermo Fisher Scientific Inc.
The contents of this document are subject to change without notice. All technical information in this
document is for reference purposes only. System configurations and specifications in this document supersede
all previous information received by the purchaser.
Thermo Fisher Scientific Inc. makes no representations that this document is complete, accurate or error-
free and assumes no responsibility and will not be liable for any errors, omissions, damage or loss that might
result from any use of this document, even if the information in the document is followed properly.
This document is not part of any sales contract between Thermo Fisher Scientific Inc. and a purchaser. This
document shall in no way govern or modify any Terms and Conditions of Sale, which Terms and Conditions of
Sale shall govern all conflicting information between the two documents.
Contents
• "Contact Us"
• "Manual Overview"

Manual Overview

Contents
• “Warnings, Cautions, and Notes” on page 1
• “Figures” on page 2
• “Physical Buttons” on page 2
• “Other Hardware” on page 3
Example Warning:
WARNING Tampering with the 5,500 ppm (Lead high) lead-in-soil standard may cause
exposure to lead dust. Keep all standards out of reach of children.
Cautions
Cautions are important recommendations. Cautions will always be identified as Cautions in
the text, and will always be visually presented as follows:
CAUTION This is a Caution.
Example Caution:
CAUTION Never tamper with Test Standards. They should not be used unless they are completely intact.
Notes
Notes are informational asides which may help you with your analyses. Notes will always be
identified as Notes in the text, and will always be visually presented as follows:
Note This is a Note.
Example Note:
Note For defensible Quality Control, keep a record of the time and precision of every calibration.
Figures
Figures are illustrations used to show what something looks like. Figures will always be labeled
and identified as Figures directly below the Figure itself, and will always be visually presented
as follows:
Physical Buttons
Physical Buttons are actual buttons on the analyzer which must be pushed to activate their
function. Physical Buttons will always be identified as Buttons in the text, and will always be
visually presented as follows:
Other Hardware
Other Hardware refers to any physical part of the analyzer which performs a necessary
function. Other Hardware will always be visually presented as follows:
Contents
• “Intended Use” on page 5
• “Safely and Effectively Using Your Analyzer” on page 6
• “Operational Specifications for Niton XL2 Plus Analyzer” on page 20
• “Radiation and Compliance Labels” on page 21
• “Battery Installation and Charging” on page 23
• “Hot Swap Feature” on page 26
• “Emergency Response Information” on page 26
This section discusses the basics of using your analyzer. Radiation safety is covered first, because using an X-ray based analyzer safely is very important. We then outline the daily startup procedure that ensures your analyzer is performing properly and at its most efficient level.
Intended Use
Get fast, accurate metal alloy verification for manufacturing quality assurance with the
Thermo Scientific™ Niton™ XL2 Plus Analyzer. The XL2 Plus Analyzer provides immediate,
nondestructive elemental analysis of alloy materials from titanium to nickel as well as tramp
and trace element analysis. Lightweight, rugged handheld Niton XL2 Plus Analyzers are well
suited for a growing list of applications including scrap metal identification, mining and
exploration, and lead screening for consumer and electronic goods.
Navigation
Use the touch screen or keypad to access XL2 Plus features.
Touch Screen
CAUTION Niton analyzers are not intrinsically safe analyzers. All pertinent Hot Work
procedures should be followed in areas of concern.
CAUTION If the equipment is used in a manner not specified by Thermo, the protection
provided by the equipment may be impaired.
WARNING Always treat radiation with respect. Do not hold your analyzer near the
measurement window during testing. Never point your analyzer at yourself or anyone else
when the shutter is open.
Serviceable Parts
There are no user-serviceable parts. All repairs must be performed by factory-trained technicians.
Ergonomics
The Niton XL2 Plus analyzer is ergonomically designed. This lightweight, compact, and
balanced instrument minimizes strain on the user, especially on extended jobs. Three separate
test stands provide added ergonomic support for development, calibration or anytime you
need to test samples repeatedly and for long periods of time. For more information see “Test
Stands” on page 165.
Unintentional Misuse
Take care to avoid unintentional misuse, which is not covered under the service warranty.
Examples of unintentional misuse include:
• letting the analyzer slip out of your hand in a damp environment
• standing the analyzer upright on the battery, putting it at high risk of falling off a counter
or desktop
Time
The longer you are exposed to a source of radiation, the longer the radiation is able to interact with your body and the greater the dose you receive. Dose increases in direct proportion to the length of exposure.
Distance
The closer you are to a source of radiation, the more radiation strikes you. Based on geometry alone, dose varies with the inverse square of your distance from the source of radiation (air attenuation reduces the dose rate further). For example, the radiation dose one foot from a source is nine times greater than the dose three feet from the source. To minimize your exposure, remember to keep your hands and all other body parts away from the front end of the analyzer when the shutter is open.
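The time and distance factors combine into a simple proportionality: dose scales linearly with exposure time and with the inverse square of distance. The short Python sketch below is purely illustrative of that geometry-only relationship; the function and parameter names are hypothetical, it is not part of the analyzer software, and real dose rates also depend on the analyzer mode, the sample, and air attenuation.

```python
# Illustrative only: relative dose from exposure time and distance (geometry alone).
# Hypothetical helper -- not part of the analyzer software; ignores air attenuation.

def relative_dose(dose_rate_at_1ft: float, distance_ft: float, exposure_s: float) -> float:
    """Dose scales linearly with time and with the inverse square of distance."""
    return dose_rate_at_1ft * exposure_s / distance_ft ** 2

# Example from the text: the dose one foot from a source is nine times
# the dose three feet from the source, for the same exposure time.
print(relative_dose(1.0, 1.0, 60.0) / relative_dose(1.0, 3.0, 60.0))  # 9.0
```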
Shielding
Shielding is any material that is placed between you and the radiation source. The more
material between you and the source, or the denser the material, the less you will be exposed
to that radiation. Supplied or optional test stands are an additional source of shielding for
analysis. A backscatter shield accessory is also available and may be appropriate in some
applications.
Exposure to Radiation
Human dose to radiation is typically measured in rem, or in one-thousandths of a rem, called
millirem (mrem), 1 rem = 1000 mrem. Another unit of dose is the Sievert (Sv), 1 Sv = 100
rem. The allowable limit for occupational exposure in the U.S (and many other countries) is
5,000 mrem/year (50 mSv/year) for deep (penetrating) dose and 50,000 mrem/year (500
mSv/year) for shallow (i.e., skin) dose or dose to extremities. Deep, shallow, and extremity
exposure from a properly used Niton XL2 Plus analyzer should be less than 200 mrem per
year, (2.0 mSv per year) even if the analyzer is used as much as 2,000 hours per year, with the
shutter open continuously. The only anticipated exceptions to the 200 mrem maximum
annual dose are: 1) routine and frequent analysis of plastic samples without use of a test stand,
backscatter shield, or similar additional protective measures, or 2) improper use where a part
of the body is in the primary beam path.
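The unit relationships above reduce to simple constant factors. The following Python sketch is only an illustrative restatement of the stated relationships (1 rem = 1000 mrem, 1 Sv = 100 rem) and of the annual figures quoted in this section; the helper name is hypothetical, and the numbers are the ones cited above, not measured values.

```python
# Illustrative restatement of the unit relationships and figures quoted above.
# Hypothetical helper names -- not part of the analyzer software.

MREM_PER_REM = 1000.0   # 1 rem = 1000 mrem
REM_PER_SV = 100.0      # 1 Sv  = 100 rem

def mrem_to_msv(mrem: float) -> float:
    """Convert millirem to millisievert (mrem -> rem -> Sv -> mSv)."""
    return mrem / MREM_PER_REM / REM_PER_SV * 1000.0

print(mrem_to_msv(5000.0))   # 50.0 -- U.S. occupational deep-dose limit, in mSv/year
print(mrem_to_msv(200.0))    # 2.0  -- upper bound cited above for proper use, in mSv/year
print(200.0 / 2000.0)        # 0.1  -- average mrem per hour if 200 mrem is spread over 2,000 hours
```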
Note NEVER OPERATE THE DEVICE WITH A PART OF YOUR BODY IN THE
PRIMARY BEAM PATH OR WITH THE PRIMARY BEAM PATH DIRECTED AT
ANYONE ELSE.
Also, consider the use of protective accessories such as a shielded test stand or backscatter
shield (or equivalent) when performing routine and/or frequent analysis of any of the
following:
• light materials (such as plastic, wood, or similarly low density/low atomic mass
samples)
• thin samples (such as foils, circuit boards, and wires)
• samples that are smaller than the analysis window.
Table 1 shows the typical background radiation doses received by the average member of the public. Table 2 shows the radiation dose limits for radiation workers in the US.
Two common types of dosimeters are whole-body badges and ring badges. Whole body
badges are often attached to the user’s torso (e.g., clipped to the collar, shirt pocket, or waist as
appropriate). A ring badge is worn on the finger as a measure of maximum extremity dose.
When worn, the specific location of the dosimeter should be that part of the body that is
expected to receive the highest dose. This location will depend on how the analyzer is used
and so it may not be the same for all users. Dosimetry services are offered by many companies.
Two companies offering dosimetry services in the USA and much of the world are:
Note Wearing a dosimeter badge does not protect you against radiation exposure. A
dosimeter badge only measures your exposure (at the dosimeter location).
* The National Council on Radiation Protection and Measurements (NCRP) was chartered by the U.S. Congress in 1964.
The XL2 Plus Analyzer is equipped with Detector ProGuard, a detector protection grid.
While the Detector ProGuard helps to minimize accidental detector punctures, care must be
taken to avoid introducing sharp objects to the analyzer measurement window area. Under the manufacturer's standard terms and conditions, detector punctures are not covered under warranty.
Avoid Over-Exposures
Direct contact with the window could result in overexposures in the times indicated in Table 3 below.
Small Samples
A small sample would be any sample that is smaller than the measurement window. Small
samples present a unique risk because they don’t block the entire beam path. The difficulty
with placing small samples down on a work surface to analyze them is that you may get
readings from the work surface that interfere with analytical results. A test stand is an effective
way of analyzing small samples accurately and safely. Never hold samples during analysis or
look into the path of the primary beam.
Notes:
Scatter measurements were taken at a radius of 5 or 30 cm around the nose of the analyzer
with the highest scatter dose rate being recorded.
Notes:
In-beam dose rates were measured using optically stimulated luminescent (OSL) dosimeters.
Reported results are based on measurement results that have been reduced to 2 significant
digits by rounding up. For example, a measurement result of 1441 would be reported as 1500.
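As an illustration of the reporting convention just described (reduce to two significant digits by rounding up), the following short Python sketch reproduces the 1441 → 1500 example; the helper name is hypothetical and not part of the analyzer software.

```python
# Illustrative only: the "two significant digits, rounded up" reporting convention
# described above. Hypothetical helper -- not part of the analyzer software.
import math

def round_up_two_sig_digits(value: float) -> float:
    if value <= 0:
        return value
    exponent = math.floor(math.log10(value)) - 1   # place of the second significant digit
    scale = 10.0 ** exponent
    return math.ceil(value / scale) * scale

print(round_up_two_sig_digits(1441))  # 1500.0, as in the example above
```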
Primary Radiation
Primary radiation is radiation that is produced by the analyzer and emitted out through the
measurement window. Individuals should never place any part of their body in the primary
beam path when the x-ray tube is on. There should always be a sample in contact with the
measurement window when the x-ray tube is on. The sample will absorb most of the
primary-beam radiation unless it is smaller than the instrument's measurement window or of
low atomic mass, low density, and/or very thin. Caution should be taken when analyzing
samples that are small, thin, and/or low in atomic mass or density as they may allow much
more of the primary beam to escape.
Secondary Radiation
Under conditions of normal and proper use, individuals can be exposed to secondary (or
“scattered”) radiation. Secondary radiation is low-level radiation that emanates from the
sample being analyzed as a result of primary beam radiation scattering in the sample or
primary beam radiation inducing fluorescent x-rays in the sample. Dose points A, A’ and B in
Figure 5 are examples of where you can encounter secondary radiation. The magnitude of this
secondary radiation is sample dependent. Higher atomic mass and density samples such as
steel will emit the lowest levels as they absorb most primary and secondary radiations. Lower
atomic mass and density samples such as aluminum, wood, and especially plastic, will produce
higher levels of secondary radiation.
The operator is reminded never to hold samples during analysis; doing so will result in higher-than-necessary exposure to secondary radiation and could expose the operator directly to the much higher primary-beam dose rates.
Shallow dose is often referred to as “skin dose” because it is a result of low penetrating
radiation that only interacts with the skin. Shallow dose is limited to a maximum of 50 rem
(500 mSv) per year in the United States and most countries internationally. Shallow dose is
listed for primary in-beam dose points only because the low penetrating radiation that causes
shallow dose is nearly all absorbed by a sample and does not produce any significant secondary
radiation. Shallow dose is measured at a point 0.007 cm below the surface.
Transportation
There are no X-ray tube specific US Department of Transportation (DOT) or International
Air Transport Association (IATA) radiation regulations regarding shipping the Niton XL2
Plus analyzer. It is recommended that you ship the analyzer in its carrying case and an
over-pack to protect the sensitive measuring equipment inside the analyzer. Do NOT ship the
analyzer with the battery pack connected to the analyzer.
If the Niton XL2 Plus analyzer is lost or stolen, notify your Radiation Safety Officer (RSO) or
the equivalent responsible individual at your company or institution immediately. Your
company's RSO, as well as other important emergency contacts, are listed below. Your
company RSO may need to notify the x-ray tube regulatory authority and the local police. It
is also recommended that a notification is made to Thermo Fisher Scientific.
Damaged Instrument
Minor Damage
If the instrument is intact but there is indication of an unsafe condition such as a cracked case,
a shutter mechanism failure, or the lights remain flashing after a measurement is terminated,
follow these steps:
1. Stop using the instrument.
2. Remove the battery. The x-ray tube cannot produce radiation when the battery is disconnected. The instrument is now safe to handle.
3. Place the instrument securely in the holster.
4. Place the instrument in the carrying case that came with the instrument.
5. Notify your Radiation Safety Officer (RSO) or the equivalent responsible individual at
your company or institution immediately.
6. You or your RSO should call Thermo Fisher Scientific at one of their contact numbers
listed below for additional instructions and guidance.
Major Damage
If the instrument is severely damaged:
1. Perform the same steps as described above for minor damage. There will be no radiation
hazard as long as the battery is removed from the instrument.
2. Place all components in a plastic bag and contact Thermo Fisher Scientific.
CAUTION The battery used in this device may present a risk of fire or chemical burn if mistreated. Do not disassemble, heat above 50 °C, or incinerate. Replace the battery with Thermo Fisher Scientific P/N 420-002 only. Use of another battery may cause fire or explosion. Dispose of used batteries promptly. Keep away from children. Do not disassemble and do not dispose of in fire.
1. Slide back the catch on the bottom of your analyzer’s pistol grip and drop the battery out
into your hand.
2. Place the old battery aside and slide the new battery up into the cavity in the bottom of
the pistol grip. The battery is keyed, and will only insert fully one way.
3. Press in until the latch resets. Do not force the battery into the cavity.
3. Place the battery pack upside down into the charger. The battery pack is keyed, and will
only fit into the charger fully one way. If your battery pack is resting on the back of the
charger rather than sliding all the way to the bottom, remove the battery pack, turn it
around, and re-insert it into the charger.
CAUTION Do not let the battery pack recharge for excessive periods of time.
Note If you take more than 15 seconds to swap in a new battery, you may need to reboot the analyzer. Measurements cannot be initiated while the instrument is in Hot Swap mode. The instrument must have a battery installed or must be plugged in to a power source.
Europe
Niton Analyzers Europe
Munich, Germany
Email: [email protected]
Asia
Niton Analyzers Asia
Hong Kong
Email: [email protected]
Common Operations
Contents
• “Setting Up Beep Times” on page 29
• “Sorting the Custom Element Display” on page 30
• “Max Measure Time” on page 33
• “Minimum Test Time” on page 33
• “Virtual Keyboard” on page 34
• “Setting Display Units” on page 36
• “Adjusting the Element Range” on page 38
• “Setting the Date and Time” on page 40
• “Calibrating the Touch Screen” on page 42
First Beep
This option allows you to change the seconds of delay before the First Beep. Select the screen
button labeled with the number of seconds of delay for the First Beep. The Beep One Time
editor will open. Clear the current number of seconds with the “C” button, then select the E
button to enter the information.
Second Beep
This option allows you to change the seconds of delay before the Second Beep. Select the
screen button labeled with the number of seconds of delay for the Second Beep. The Beep
Two Time editor will open. Clear the current number of seconds with the “C” button, then
select the E button to enter the information.
Third Beep
This option allows you to change the seconds of delay before the Third Beep. Select the screen
button labeled with the number of seconds of delay for the Third Beep. The Beep Three Time
editor will open. Clear the current number of seconds with the “C” button, then select the E
button to enter the information.
On the left of the display are elements, each with its currently selected display option beside it to the right. The element list is ranked by importance, with the most important element on top; each element lower down is of less importance than the one above it.
By selecting an element and using the arrow buttons to the right of the list, you can change its
ranking. Use the Up Arrow Button to move an element one rank closer to the top with each click.
Use the Down Arrow Button to move an element one rank closer to the bottom with each
click.
Display Options
The Display Options Drop Down Menu allows you to change the display status of any
element to one of three states:
• Normal - The standard state. The element displays only when its value is greater than the limit of detection.
• Always - Always display the results for this element. Use this state for elements critical to all of your analyses.
• Never - Never display the results for this element. Use this state for elements which are unimportant to your work. This makes your instrument display less complex.
Select the element you want to change, then select the menu option corresponding to your
choice of display status. The currently selected element is displayed in white on black.
Select the Save Button to save your current status as the new default. Select the Reset button
to reset the settings to the previously saved state. Select the Close button to exit the screen
without saving.
Report Settings
Under Electronics Metals, Plastics, and TestAll Modes, a field called Report Settings is available. Selecting the triangle next to the Report Settings field will open a pop-up menu allowing you to choose between the three Report Settings Modes. Select the mode you wish to edit.
Changing the settings for one analysis mode will not affect the settings for other modes, and
the configurations can be saved independently.
RoHS Option
When the RoHS Option is selected, clicking on the Pass and Fail values works as it does in
any other Mode.
Detection Option
When the Detection Option is selected, selecting the Pass/Fail field for that element acts as an On/Off toggle, which will switch Pass/Fail mode between On and Off for the selected element. Selecting it again will reverse the toggle.
Virtual Keyboard
Whenever you see the Keyboard Icon, you can select it to bring up a Virtual Keyboard on
your touch screen. Generally, selecting the keys on the Virtual Keyboard will type the
corresponding character into the field. The exceptions are the meta-keys Del, Clear, Left,
Right, Shift, Backspace, Cancel, and Enter.
In the Display Units area, you can select between Percent composition and Parts per Million
as the units displayed in a measurement, and you can change this setting independently for
any mode. You can also change the Sigma for each of these modes independently. When you
have changed the display units to the appropriate values, select the Close button to save these
settings for use.
Confidence Intervals
Confidence intervals assume that the data are from an approximately normally distributed
population - generally, sums of many independent, identically distributed random variables
tend towards the normal distribution as a limit. Under this assumption, about 68% of the values lie within 1 standard deviation of the mean, about 95% lie within two standard deviations, about 99.7% lie within 3 standard deviations, and about 99.99% lie within 4 standard deviations.
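For reference, the coverage percentages quoted above follow directly from the normal distribution: the fraction of values within k standard deviations is erf(k/√2). The short Python sketch below simply reproduces those figures and is not part of the analyzer software.

```python
# Illustrative only: normal-distribution coverage within k standard deviations,
# which is erf(k / sqrt(2)). Reproduces the percentages quoted above.
import math

def coverage_within_sigma(k: float) -> float:
    return math.erf(k / math.sqrt(2.0))

for k in (1, 2, 3, 4):
    print(f"{k} sigma: {coverage_within_sigma(k) * 100:.2f}%")
# 1 sigma: 68.27%   2 sigma: 95.45%   3 sigma: 99.73%   4 sigma: 99.99%
```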
The greater the sigma value of the test, the more confident you can be that the sample is as it
appears, but the more difficult and time consuming the testing must be to verify this. That's
why it's important to use the most appropriate sigma value for the test. By adjusting the sigma
value for each type of test, you can optimize the process for your needs.
When you have changed the sigma values to the appropriate number, select the Close button
to save these settings for use.
Multi-Range tests are used to either preferentially excite specific elements for increased
sensitivity, or to cover a wider element range than one Range alone can provide. The XL2 Plus
analyzer has 2 element ranges: Main and Light.
• Main Range is used for the analysis of most elements (Ti to U)
• Light Range is typically used for the analysis of light elements (Mg, Al, Si, S, P).
Select the mode you wish to configure from the Mode Menu. You can set different
configurations for different calibration modes.
The Element Range Screen enables you to directly enable or disable a Range, or control the
time that a Range alters the irradiation of the sample before auto-switching to another Range.
In typical metals analysis applications, Main Range is used for the analysis of most elements.
You cannot deselect the Main Range in metals analysis.
Select the Element List Button - labeled with a question mark - to display the Element List for
that Range. This list shows the elements that the Range is best designed to detect.
Select the Range Time field for the intended range to change the switch time for that range.
The Range Time Editor will appear. This enables you to set the number of seconds each
enabled range is allotted before auto-switching will occur when needed during sample testing.
Your analyzer will auto-switch from one range to another when the testing time for that range
is greater than or equal to the time you have chosen, and the identified alloy is flagged as
needing the switch in the Niton Alloy Library.
Select the C button to clear the current time, then from the virtual numeric key pad, select
each digit you want to input, then select the E button to enter.
From the System Menu, select the Date & Time icon from the System Screen to set the date
and time as needed for different time zones, daylight savings time, or any other reason. The
date and time are factory preset prior to shipping. The clock is a 24-hour clock, so add 12 to PM hours (e.g., 1:13 PM would be 13:13).
When the Date & Time button is selected, the Date & Time Screen comes up on your
analyzer’s LCD Screen. You may change the Month, Year, Date, Hour, and Minute on your
analyzer.
Proper calibration of the touch screen to the display is important for touch accuracy. If you
find the touch screen does not respond to the buttons you touch, you should re-calibrate it.
1. Select the Calibrate Touch Screen button from the System Screen to re-calibrate the analyzer's touch screen display. This procedure establishes the display boundaries for the touch screen interface.
2. The display will show a message asking you to confirm whether or not you want to
calibrate your Touch Screen. Select the Yes button.
3. The display will show the message: “Calibrate Touch Screen”. There will be a small cross in
the upper left-hand corner of the display.
4. Tap on this cross with the stylus, and the cross will disappear and reappear in the upper
right-hand corner of the screen.
5. Tap on the cross again, and it will reappear in the lower right-hand corner of the screen.
6. Tap on the cross again and it will reappear in the lower left-hand corner of the screen.
7. Tap on the cross once more, and you will be presented with a Confirmation Screen.
8. Select the Yes Button to confirm that the parameters are good. Select the No Button to start
the process again.
9. Once you have confirmed the parameters, the System Menu will be displayed. The screen is
now calibrated.
The Touch Screen can be calibrated - and the system operated - with a USB mouse plugged
into the USB ports in the rear of the analyzer.
Analysis Modes
Contents
• “Analysis Modes Descriptions” on page 45
• “Metals Analysis” on page 48
• “General Metals: Standard Operating Procedure” on page 50
• “Taking a General Metals Reading” on page 53
• “Precious Metals Analysis” on page 55
General Metals Mode
This mode will attempt to return an Alloy Grade Identification by matching the analyzed
composition of the sample with the nominal composition of alloys in the analyzer's Alloy
Grade Library. It will also return an elemental composition of the alloy as analyzed. Alloy
Composition is output by default in terms of percent of composition by weight.
Electronic Metals Mode
This mode will attempt to return an Alloy Grade Identification by matching the analyzed
composition of the sample with the nominal composition of electronic alloys in the analyzer's
Alloy Grade Library. It will also return an elemental composition of the electronic alloy as
analyzed. Electronic Metal Composition is output by default in terms of percent of
composition by weight.
Plastics Mode
Use this mode to analyze samples composed primarily of plastic. This mode will return an
elemental composition of the plastic sample as analyzed. Plastic Composition is output by
default in terms of parts per million.
Soils Mode
Use this mode to analyze samples composed primarily of soil and rock. This mode will return
an elemental composition of the soil sample as analyzed. Soil Composition is output by
default in terms of parts per million.
TestAll Mode
Use this mode to analyze samples composed of unknown and/or mixed composition, such as
toys and consumer products. This mode will attempt to return a general Material
Identification by comparing the analysis with other general types of materials. It will select the
proper sub-mode for analysis and return an elemental composition of the sample as analyzed.
Material Elemental Composition is output by default in terms of parts per million.
Note Due to the nature of this mode, your analyzer will only use factory calibrations. User-modified Cal Factors will not be available.
Metals Analysis
To analyze metal samples, from the Main Menu select the Sample Type icon, and then select the Metals icon. Once in the Metals Selection Screen, there will be 1 to 5 Metals Calibration Modes (depending on the model number and on optional calibrations purchased).
Preparatory Tasks
1. Attach a charged battery to the analyzer and turn it on. Follow the screen instructions and
“Log On” as the operator using either the default password or a custom one as designated
by the user in an NDU file.
2. Verify that the date is set properly for data tracking purposes.
From the Main Menu, select the System icon, then the Specs icon. The date will be
displayed for verification. If the date is incorrect, correct it prior to proceeding. This can
be done by “Closing” out of the Specs screen and selecting the Date & Time icon.
Detailed information on this procedure is available in Setting the Date and Time.
3. (Optional) Connect the analyzer to a computer via the included USB cable. (Consult
“Using a USB Cable to Connect Your Analyzer” on page 136 if necessary.)
4. During analysis and detector calibrations, it is important to ensure that the analyzer is not
exposed to strong electromagnetic fields, including those produced by computer
monitors, hard drives, cellular telephones, walkie talkies, etc. Keep a minimum two (2)
feet (0.7 meters) distance between the analyzer and electronic devices.
5. From the Main Menu, select the System Check icon, then the Yes button.
b. If the unit shows a failure error, then perform a second System Check by clicking
Recheck. If the unit still does not show a “System OK,” please contact Thermo
Scientific. See “Contact Us” on page 1.
6. Thermo Scientific Niton XL2 Plus analyzers are equipped with excitation filters that
optimize the analyzers’ sensitivity for various elements. The “Main Range” filter provides
optimum sensitivity for the elements titanium (Ti) through uranium (U). The “Light
Range” filter is used to optimize sensitivity of light elements, magnesium (Mg),
aluminum (Al), silicon (Si), phosphorus (P) and sulfur (S). The amount of time that the
analyzer spends in each filter position is user definable, but the default settings should be
used unless there is reason to change them. Please note that the analyzer will continue
alternating excitation filters until the user selectable maximum analysis time is reached or
the operator terminates the measurement.
Verify Accuracy
Verify instrument measurement accuracy using the 2 reference materials (RM) supplied with
the analyzer.
1) Reference material (1.25Cr 0.5Mo). Test this sample for 30 sec on the main range filter only.
2) Al 6061 check sample. Test 5 sec on the main range and 30 sec on the light range filters.
If the sample is correctly identified and all major elements read within the calculated acceptance limits (within the low and high values of the factory readings found on the QC sheet), proceed to the General Testing Protocol section.
If the analyzer reports values outside the acceptance tolerance ranges specified in the tables,
repeat the detector calibration then repeat the reference sample analysis.
If the analyzer again fails to meet the acceptance tolerance ranges specified in the tables, please
contact Customer Support. See “Contact Us” on page 1.
The analyzer will often display a correct alloy identification and/or accurate chemistry result
before the specified time interval. If the accuracy meets the user’s requirements, it is not
necessary to measure for the full time. Longer measurements might be necessary if low
concentrations of elements must be determined.
INSTRUMENT QC
Measure the supplied reference calibration check sample AT LEAST once a shift. If the result is correct, continue work. If it is incorrect, redo the System Check and re-take the past 2 hours of results.
For samples that do not fully cover the measurement aperture, increase the testing time in inverse proportion to the fraction of the aperture covered. For example: a rod covers only ½ of the aperture, so increase the measurement time by a factor of two (e.g., from 10 to 20 seconds per filter for alloy chemistry).
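The rule of thumb above amounts to dividing the normal per-filter time by the fraction of the aperture the sample covers. The following Python sketch illustrates that arithmetic only; the helper name and values are hypothetical, and the adjustment is still made manually on the analyzer.

```python
# Illustrative only: scale the per-filter measurement time in inverse proportion
# to the fraction of the aperture covered. Hypothetical helper -- the adjustment
# is made manually on the analyzer.

def adjusted_time(base_time_s: float, aperture_coverage: float) -> float:
    """aperture_coverage is the covered fraction of the window, between 0 and 1."""
    if not 0.0 < aperture_coverage <= 1.0:
        raise ValueError("aperture_coverage must be a fraction in (0, 1]")
    return base_time_s / aperture_coverage

print(adjusted_time(10.0, 0.5))   # 20.0 -- a rod covering half the aperture doubles the time
```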
The best procedure to measure undersized samples is to use the Thermo Scientific Niton
portable test stand (optional), which is shielded to prevent radiation exposure to the operator.
An undersized sample may alternatively be measured while lying on another material. Results
may be affected by the signal coming from the underlying material itself. Use only pure
aluminum, pure plastic, or clean wood and employ the Disable Al feature. Use the Tools
Menu, then select Disable Al, and check the underlying surface itself to be sure no metals are
present. Be sure to use the Tools Menu and select Enable Al before testing aluminum alloys.
a. The Info screen lists the current mode and library being used, plus configuration
info.
b. Select Data Entry if you wish to do any data entry.
c. Select Tools to configure or customize the mode.
6. Initiate a Reading by pressing the trigger.
7. When the sample has been sufficiently analyzed, release the trigger.
Select the Sample Type icon from the Main menu, then the Metals icon, then the Precious
Metals icon to enter Precious Metals Mode.
When testing metals with Au in their composition, Precious Metal Analysis returns the
metal's Karat rating.
Note In order to use Karat analysis with the Niton XL2 Plus analyzers, you need to set up Karat as a pseudo-element first.
Figure 19. Precious Metals Analysis of 24 Karat Gold Showing Karat Rating
When testing metals with no Au in their composition, Precious Metal Analysis returns the
metal's Karat rating as nd (Not Detected).
When testing metals with Au present, Precious Metal Analysis returns the metal's Karat rating
as rounded to the nearest Karat at the top of the list. Where the Au is listed on the Complete
List of elements, the Karat rating is displayed in full.
AuDIT™
The AuDIT algorithm determines whether or not a surface is plated. AuDIT can detect plating as thick as 8 µm. Since most plating is less than 2-3 µm, AuDIT can usually detect plated objects. Heavily plated objects with a plating greater than 8 µm thick will read as Gold Plate Not Detected.
Starting/Stopping AuDIT
AuDIT can be toggled on or off from your Tools Menu. This toggle is only available in
Precious Metals Mode. Selecting the AuDIT:Off button will turn AuDIT on, and change the
button to “AuDIT:On”. Selecting the AuDIT:On button will turn AuDIT off, and change the
button to “AuDIT:Off ”.
Figure 22. Precious Metals Tools Showing AuDIT:Off and AuDIT:On Buttons
AuDIT uses four separate tests run automatically to determine whether or not a sample is
plated.
1. The first test is a patent-pending method only available on Thermo Scientific Niton XRF
analyzers. This is an iterative comparison of x-ray intensity signatures, which, when it fails, is
the most likely indication of a plated item.
2. Nickel is often used as a pre-plate, and high proportions of Ni in a reading are a good
indicator of plating.
3. Platings often have a low Karat value when averaged with the substrate, so Karat values of
less than 9 are flags indicating plating.
4. A Karat rating that is not one of the standard Karat percentages - within 0.5 karat of 9kt,
10kt, 14kt, 18kt, 22kt, or 24kt (referred to as Out of Plumb) - also may indicate the presence
of plating.
Only if the sample passes all four tests is it labeled “Gold Plate Not Detected”. This does not
mean that there is no plating, but that the presence or absence of plating cannot be
determined by the analyzer.
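For illustration only, the Python sketch below models the two numeric checks described in tests 3 and 4 above (low Karat and "Out of Plumb"). It is not the AuDIT algorithm itself: the patent-pending intensity comparison (test 1) and the Ni pre-plate check (test 2) are not represented, the helper names are hypothetical, and this section quotes a 9 kt threshold for test 3 while the AuDIT Messages section below uses 8.5 kt (the sketch uses 9 kt).

```python
# Illustrative only: the numeric checks described in tests 3 and 4 above
# (low Karat and "Out of Plumb"). This is NOT the AuDIT algorithm; the
# patent-pending intensity comparison (test 1) and the Ni pre-plate check
# (test 2) are not modeled here.

STANDARD_KARATS = (9, 10, 14, 18, 22, 24)

def karat_flags(karat: float) -> list[str]:
    flags = []
    if karat < 9:                                             # test 3 threshold as quoted above
        flags.append("Low Karat")
    if all(abs(karat - std) > 0.5 for std in STANDARD_KARATS):
        flags.append("Non-Standard Karat (Out of Plumb)")     # test 4: not within 0.5 kt of a standard value
    return flags

print(karat_flags(13.2))  # ['Non-Standard Karat (Out of Plumb)']
print(karat_flags(21.8))  # []  -- within 0.5 kt of 22 kt and not low
```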
AuDIT Messages
If AuDIT is not enabled, a white on black message stating “AuDIT Disabled” will display on
the Results Screen while in Precious Metals Mode.
If AuDIT detects what looks like gold plating on the material, a black on red message stating
“Gold Plate Probable” will display on the Results Screen.
If AuDIT detects what may be gold plate, but isn't sure, a black on yellow message stating
“Gold Plate Suspect” will display on the Results Screen.
If AuDIT detects what is either unplated gold or very thickly plated gold, a black on white
message stating “Gold Plate Not Detected” will display on the Results Screen.
If AuDIT finds too much Nickel in the sample, a black on yellow message stating “High Ni
Content” will display on the Results Screen.
When AuDIT finds a Karat rating less than 8.5 Karat in the sample, a black on yellow
message stating “Low Karat” will display on the results Screen.
When AuDIT finds a Karat rating other than the standard Karats, a black on yellow message
stating “Non-Standard Karat” will display on the results Screen.
Advanced Settings
Contents
• “Adjusting the Element Range” on page 66
• “Tools Menu Options” on page 72
• “NDF Files: User Data Structuring” on page 93
• “Safety Settings” on page 105
• “Camera” on page 111
Multi-Range tests are used to either preferentially excite specific elements for increased
sensitivity, or to cover a wider element range than one Range alone can provide. Most modes,
when enabled, will use two Ranges in sequence to produce a combined analysis result. In
typical alloy analysis applications, Main Range is used for the analysis of most elements, Low
Range is utilized for the subsequent high sensitivity analysis of V, Ti, and Cr, and Light Range
is available only with 900 series GOLDD technology analyzers, and is typically used in light
element analysis. Multi-Range switching can be set to activate based on time alone, or, when time switching is disabled, based on settings in the General Metals grade library. In most modes, Low and
Light Range add the capability to analyze light elements which cannot be efficiently excited by
Mid Range.
Select the mode you wish to configure. You can set different configurations for different
modes.
The Element Range Screen enables you to directly enable or disable any Range, or control the
time that a Range alters the irradiation of the sample before auto-switching to another Range.
Select the checkbox next to the Range you want to use to determine exactly which of the
Ranges contained in your Analyzer is used for sample testing. Selecting an empty checkbox
will enable that range and place a check into the box as an indicator. Selecting a checked box
will disable the Range and clear the box.
In typical alloy analysis applications, Main Range is used for the analysis of most elements.
You cannot deselect the Main Range in alloy analysis.
Low Range is utilized for the subsequent high sensitivity analysis of V, Ti, and Cr.
Select the Element List Button to display the Element List for that Range. This list shows the
elements that the Range is best designed to detect.
Select the Range Time field for the intended range to change the switch time for that range.
The Range Time Editor will appear. This enables you to set the number of seconds each
enabled range is allotted before auto-switching will occur when needed during sample testing.
Your analyzer will auto-switch from one range to another when the testing time for that range
is greater than or equal to the time you have chosen, and the identified alloy is flagged as
needing the switch in the Niton Alloy Library.
Select the C button to clear the current time, then from the virtual numeric key pad, select
each digit you want to input, then select the E button to enter.
Avg Forward
Enables you to average different readings together from this analysis forward. Select the Avg
Forward button to initiate future sample averaging. Avg Forward will set up an automatic
personal averaging protocol to be followed until your analyzer is shut down, or this feature is
disabled. To begin, select the number of readings you want to average from the virtual
numeric keypad. Your analyzer will calculate an average reading after that number of tests, and
continue this pattern until stopped. If you select to average forward 3 readings, and you take 3 readings, the analyzer will store the individual readings. The analyzer will then automatically calculate the average of the 3 readings and store an averaged reading.
Avg Back
Enables you to average different readings together from this analysis backward. Select the Avg
Back option to initiate backwards sample averaging. Avg Back will take the number of
readings you select and average their analytical results. The range is counted from the last
reading backward by the number of readings selected. If your last reading was #15, selecting 3
would average readings #13, 14, and 15. The average is calculated, displayed, and stored into
memory as the next sequential reading number, in this case, #16.
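As an illustration of the Avg Back arithmetic described above, the following Python sketch averages the last N stored readings and appends the result as the next sequential reading. The data layout and function name are hypothetical and do not reflect how the analyzer stores readings internally.

```python
# Illustrative only: the Avg Back arithmetic described above. Hypothetical data
# layout (readings as {element: percent} dicts) -- not how the analyzer stores data.

def avg_back(readings: list[dict[str, float]], n: int) -> dict[str, float]:
    """Average the last n readings, element by element."""
    last_n = readings[-n:]
    elements = set().union(*(r.keys() for r in last_n))
    return {el: sum(r.get(el, 0.0) for r in last_n) / len(last_n) for el in elements}

readings = [
    {"Fe": 70.1, "Cr": 18.2, "Ni": 8.1},   # reading #13
    {"Fe": 70.4, "Cr": 18.0, "Ni": 8.3},   # reading #14
    {"Fe": 70.2, "Cr": 18.1, "Ni": 8.2},   # reading #15
]
readings.append(avg_back(readings, 3))     # stored as the next reading, #16
print(readings[-1]["Fe"])                  # 70.23... -- the averaged Fe value
```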
Note You cannot average readings taken in different modes. Doing this will generate an error.
Spectrum:On/Spectrum:Off
The Tools Menu contains a toggle option to display live spectra as sample analysis occurs.
Print (Alt)
Select the Print option from the Tools Menu to print the current analysis screen to any
attached Bluetooth printer. If you do not have a portable printer attached to your analyzer,
nothing will happen.
Set Pass/Fail
You can set up your analyzer to sort on a pass/fail basis. Pass/Fail uses the chemistry of a
user-generated list of alloys in the library as a basis for comparison. If the sample analysis is
entirely within the specifications for one of these alloys, a PASS result is given, otherwise a
FAIL result is returned. To turn on Pass/Fail, select the Tools Menu and select Set Pass/Fail
from the menu. The Pass/Fail Setup Screen will come up.
Add/Remove (Toggle)
Select alloys from the Available list and then the Add Button to move the alloy to the Selected
List. Select alloys from the Selected list and then the Remove Button to remove the alloys
from the Selected List.
Pass
Select the Pass Single button to initiate Pass Mode. Use Pass Mode when you have a desirable
match. If the alloy being analyzed matches one of the alloys in the selected list, the alloy will
Pass the analysis.
Fail
Select the Fail button to initiate Fail Mode. Use Fail Single Mode when you have an
undesirable match. If the alloy being analyzed matches one of the alloys in the selected list, the
alloy will Fail the analysis.
Type the name of your reference alloy into the Virtual Keyboard, and the left column will
display any matches. Select the match you want and the Add button to make it your reference
alloy.
Pass/Fail compares the chemistry to that of the alloy(s) selected, using the cutoff you selected.
When the sample analysis reaches a match with the chemistry of any one of the alloys on the
Selected list, a PASS or FAIL notice is generated as appropriate.
Alloy Display
The Alloy Display selection allows you to choose how many alloy library matches are displayed during analysis and in readings.
• Hide Alloy Matches - displays no match at all
• Show 1st Match - displays the best alloy match
• Show 1st & 2nd Matches - displays the 2nd alloy match when the match quality between
the 2 alloys is close in value
Match Quality
When Alloy Display is set to show matches, the match quality is color coded as shown below.
Enable/Disable Al
Normally, the collective amount of unquantifiable light elements in alloy analysis - the
“balance” - is assumed to be aluminum and labeled as such in the analysis. Selecting the
Disable Al button from the Tools Menu will delete this “aluminum” from the analysis results,
showing only the quantified elements. Selecting the Enable Al button, the default state, will
label this “balance” as “aluminum”.
Thickness Correction
Thickness Correction is used only in Plastics mode. Plastics, and polymers in general, unlike
metals or soil, are very weak absorbers of X rays. This is because polymers are composed
mainly of very light elements such as carbon and hydrogen. While just half a millimeter of
steel will completely stop 23.1 keV energy X rays of cadmium, for example, it takes at least
10mm of plasticized PVC and as much as 100mm of polyethylene (PE) to do so. Fortunately,
polymers that may contain cadmium (Cd), lead (Pb) and other restricted elements would also
contain considerable quantity of elements such as antimony (Sb), bromine (Br), titanium
(Ti), etc. Their presence results in much stronger absorption of X rays which means that,
instead of 100mm, it takes only about 15mm of compounded PE to achieve saturation
thickness for these X rays. If the thickness of the analyzed polymer sample is less than 5mm for PVC or less than about 9mm for a "typical" PE, the measured intensity of X rays will be a function of both analyte concentration and sample thickness. This is why measurements performed on thin samples (less than saturation thickness) need to be corrected for thickness.
Figure 44. How to Enable and Adjust Thickness Correction for Plastics Analysis
Whenever possible, one should analyze as thick a sample as available. For example, if the
analyzed object is a piece of heat-shrink tubing with wall thickness of 0.3mm, the best way to
analyze it is to obtain several pieces of the tubing (four for example) and stack them like a flat
sandwich, with the thickness correction set to 1.2mm. Doing so makes for faster and more
precise analyses. While it would be possible to analyze just a single layer of the tubing with
correction at 0.3 mm, by stacking several layers we reduce the relative error of measurement
(by a factor approximately equal to the square root of the number of layers). Conversely, when
analyzing thinner samples, we need to extend the measurement time fourfold (by the number
of layers) in order to maintain the same relative error of measurement. We can see how
quickly measurement time would escalate to impractical levels for thinner samples.
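The stacking trade-off described above can be summarized numerically: stacking N layers improves the relative error by roughly √N at the same measurement time, while measuring a single layer to the same relative error takes roughly N times as long. The Python sketch below illustrates this with the four-layer example; it is an approximation under the stated assumptions, not an exact counting-statistics model, and the helper names are hypothetical.

```python
# Illustrative only: the stacking trade-off described above, under the stated
# approximation that relative error improves with the square root of the number
# of stacked layers. Hypothetical helpers -- not an exact counting-statistics model.
import math

def error_factor_vs_single_layer(layers: int) -> float:
    """Relative error of an N-layer stack compared with one layer, same time."""
    return 1.0 / math.sqrt(layers)

def time_multiplier_single_layer(layers: int) -> int:
    """Extra measurement time a single layer needs to match the stacked error."""
    return layers

print(error_factor_vs_single_layer(4))   # 0.5 -- four stacked 0.3 mm layers halve the error
print(time_multiplier_single_layer(4))   # 4   -- or measure one layer four times longer
```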
Examples: The most frequent instances in which thickness correction would be called for are
analyses of plastic sheeting or plastic insulation on wires and/or cables and heat shrink tubing.
Flat plastic sheeting or plastic enclosures pose no problems. We can either analyze an object
“as is”, or stack several layers of it before analysis. Plastic insulation such as that on wiring or
cables requires a little more sophisticated approach. First, the wire must be removed so that
only insulation is analyzed. Then, the insulation should be flattened for analysis, and a
thickness correction should be applied that is equal to double the wall thickness. Alternatively,
if the insulation is stiff, it should be cut lengthwise into strands which are placed on the
instrument for analysis. The applied thickness correction should be equal to the wall thickness
of the sleeve. Both operations are shown in Figure 37 and Figure 38.
A piece of large diameter heat shrink tubing presents an interesting case. It is tempting to
analyze this object as is - see Figure 39. However, one needs to know that while lead or
bromine or chromium X-rays from the upper wall of tubing will not contribute to the signal
measured, X rays of such elements as cadmium, antimony, tin or barium in the upper wall will
significantly contribute to overall signal. It is therefore imperative to either flatten the tubing
for analysis or cut it in pieces and then analyze as shown in Figure 40.
Enable/Disable Paint
Selecting the Enable Paint option from the Tools Menu will enable the Painted Products
mode and toggle the option to Disable Paint. Selecting the Disable Paint option will disable
Painted Products mode and toggle the option to Enable Paint.
Action Level
Action Level is available only from the Lead Paint modes. Selecting the Action Level option
from the Tools Menu will enable you to change the action level used for qualitative testing.
Print Data
Selecting the Print Data option from the Tools Menu will print the currently displayed data to
the optional attached printer. See “Print (Alt)” on page 75 for details.
Coatings Method
Metals are sometimes coated with various materials. If you wish to analyze the coating, select
the Coatings Method. Coatings Analysis Mode is an optional mode which can be purchased
and added to your analyzer.
7. Enter a user name and password, then select the privileges assigned to this user. Selecting the Check All check box will enable all features.
a. You are now ready to upload your password file to the analyzer.
9. Be sure the analyzer is switched on; connect the analyzer using a USB or serial connection.
10. Select the Upload icon.
This will create a new window in which you can create your own fields, and specify their
structure and parameters. The new window will appear with a single box, called “Untitled.”
By right-clicking on this box, you can access a pop-up menu allowing you to set the mode of
the new data fields. Select New Mode to access the menu.
The Mode you select will be the Mode within which the new data entry fields will appear. If
you have multiple Modes enabled on your analyzer, the new fields will only be available from
the Mode you select. Only the default fields will be available from the other Mode or Modes.
When you select the Mode for the new data fields, the Construction Window will change to
look like this:
The “M” indicates the mode you have chosen - in this case Alloy Mode. Right click on the
Mode name to access a pop-up menu.
Select New Field from the menu, and a blank new field will appear in the construction
window.
Right clicking on the New Field box will bring up another pop-up menu. This menu gives
you various options for using the field in your operations.
Selecting Required makes it mandatory that the new field be filled in prior to taking a
measurement. This is very useful for necessary descriptors which vary from measurement to
measurement, such as lot numbers, condition descriptors, locations, etc.
Selecting the Incremental option sets up a field which increments the field descriptor by one
for each measurement taken. This option is handy for measuring several items with identical
descriptors, such as samples within a single lot, or several instances of the same part number,
because it appends the incremental number to the descriptor.
Selecting Clear Every Reading will toggle between two states. By default, the field will fill with
the data which was input during the last reading. By selecting Clear Every Reading, you tell
the instrument to clear the data from the field for each new reading, ensuring that the person
taking the reading must input new data each time. This is very useful for times when the data
descriptor is expected to vary widely between readings.
The state of each of these options can be seen in the Field Status Window at the bottom of the
Construction Window. All options in effect for the field selected are checked.
This shows a field with no options in effect, the default configuration. This is a field that will
present the previous reading’s data for this field - which may be changed by the user - without
incrementing it, but does not require the user to input any data if there is none already there
from a previous reading.
This shows a field with both Required and Clear Every Reading options in effect. This
presents a field that is cleared for each reading, and must be filled in by the user before a
reading is taken.
Selecting Edit from the pop-up menu allows you to edit the name of the field in the Editing
Window to the right of the Construction Window.
Selecting the box to the left of the field toggles the Required option on or off.
Selecting Copy from the pop-up window allows you to copy the currently selected field.
Once you copy a field, the Paste option can be selected to paste the copied field into the
Construction Window.
Selecting the New Entry option from the pop-up menu allows you to define a choice for the
user for this field.
The “E” is for “Entry.” You can edit the entry once it is created, the same way as you edit the
field name. Right click on the entry name, and choose Edit from the pop-up menu.
You can sort your entries by name, alphanumerically, by right clicking on the field and
selecting “Sort” from the pop-up menu.
To delete a field or entry, just right click on the item you wish to delete, and select Delete from the pop-up menu.
When you are finished creating your new NDF file, Upload it to your instrument using the
Upload icon.
Make sure the instrument is connected to your computer by testing the connection first. Use
the Test button on the Upload Window.
Safety Settings
Access to the Safety Settings Screen is blocked unless the user logging in has explicitly been
granted Safety access. The default login of 1234 does not have Safety access. See Passwords
and User Privileges.
The Safety Settings Screen enables you to change the Method of Operation for your
analyzer. Each checkbox on the screen enables or disables the named safety device as a precondition for operation. For example, checking the Proximity Button Required
checkbox sets the engagement of the Proximity Sensor as a precondition for operation.
Checking the Proximity Button Required checkbox and the Interlock Button Required
checkbox sets the engagement of both the Proximity Button and the Interlock Button as
preconditions for operation.
Safety settings always override start-stop settings. If your safety setting requires the use of the
Proximity Button, you cannot set start-stop settings which ignore the Proximity Button. For
example, the Easy Trigger start-stop setting must have the Backscatter safety setting enabled.
While using Easy Trigger, you cannot disable Backscatter.
WARNING The backscatter sensor is enabled by default and acts as a recommended safety
feature for most applications. Some sample types, however, cannot be analyzed when this
feature is enabled. Samples that present very little mass to the analysis window, such as certain
thin films, thin layers of plastic, and very thin wires, may not be of sufficient mass to allow the
analysis to continue while backscatter is enabled. One should disable the backscatter feature
only when necessary to analyze such low mass samples, and re-enable the feature when
finished with these sample types. These samples also provide very little absorption of the
primary x-ray beam so it is typically most appropriate to analyze these samples in a test stand
when possible.
Start/Stop Setup
The Start/Stop Setting Screen enables you to change the preconditions for operation at a
lower level than the Safety level. See Safety Settings for more information. Start/Stop settings
cannot contradict Safety settings.
The Start/Stop parameter options are Proximity Start and Remote Trigger. There is also a field
to set the maximum time for sample analysis before the analysis stops.
Proximity Start
Select the Proximity Start checkbox to use the Proximity Start parameters. Using Proximity
Start, once the reading has been started, release of the Proximity Button will immediately stop
the analysis. You cannot use Proximity Start with Easy Trigger.
Remote Trigger
Select the Remote Trigger checkbox to use the Remote Trigger parameters. Remote Trigger is
used with the Extend-a-Pole accessory to control the analysis. With the Extend-a-Pole's input
cable connected to the analyzer's Remote Trigger port, you can initiate and stop analysis
remotely from the Extend-a-Pole's handle trigger. You can use Remote Trigger with either
Proximity Start or with Easy Trigger.
Select the Max Time field to change the maximum analysis time parameter.
Methods of Operation
CAUTION After being powered on, your Niton Analyzer will perform an internal
re-calibration before an analysis is initiated.
There are six different methods of operation for taking a sample measurement, and your
analyzer will be configured to use one of those methods for alloy samples, depending on the
regulatory requirements of your locality. These methods are:
• Trigger-Only method. With the Trigger-Only method, you only need to place the
measurement window flush with the sample to be analyzed and pull the trigger for sample
analysis to be initiated.
• Trigger-and-Proximity-Sensor method. With the Trigger-and-Proximity-Sensor method,
you must place the measurement window against the sample to be analyzed to engage the
proximity sensor on the front of the analyzer, then pull the trigger for sample analysis to
be initiated.
• Momentary-Trigger-Touch-and-Proximity-Sensor method. With the
Momentary-Trigger-Touch-and-Proximity-Sensor method, you must place the
measurement window against the surface to be analyzed to engage the proximity sensor
on the front of the analyzer, then pull the trigger. The trigger may be released and the
reading will continue until you release the proximity button, or other criteria (such as
Max Time) are reached.
• Trigger-and-Interlock method. With the Trigger-and-Interlock method, you need to
place the measurement window close to the sample to be analyzed, press and keep
pressing the interlock button at the rear of the analyzer with your free hand, then pull the
trigger for sample analysis to be initiated. The interlock button is located at the very
center of the keypad, in between the left/right, up/down arrows.
• Trigger-Interlock-and-Proximity-Sensor method. With the
Trigger-Interlock-and-Proximity-Sensor method, you must place the measurement
window against the sample to be analyzed to engage the proximity sensor on the front of
the analyzer, press and keep pressing the interlock button at the rear of the analyzer with
your free hand, then pull the trigger for sample analysis to be initiated.
• Easy Trigger method. With the Easy Trigger method, you need only place the
measurement window against the sample area and pull the trigger once to initiate a
sample analysis. Your analyzer continuously samples the backscatter, using an internal
algorithm, to determine whether the measurement window is against a sample or
pointing into empty air. If it determines that there is no sample directly against the
measurement window, the analyzer stops directing radiation through the window as
soon as that determination is made.
With any of these methods, analysis will stop if any one of the preconditions is violated. For
example, with the Trigger-Interlock-and-Proximity-Sensor method, if the trigger, the
Proximity Sensor, or the Interlock is released, the reading will stop immediately and the x-ray
tube will shut down.
After your analyzer is calibrated, initiate a sample reading using the appropriate method. If
you attempt to initiate a sample reading using a different method, the analyzer will inform
you that one or more of the preconditions need to be met in order for sample analysis to
begin.
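The way these preconditions combine can be summarized as simple boolean logic. The sketch below is purely illustrative: the method names, input names, and Max Time handling are assumptions made for this example and do not reflect the analyzer's internal behavior.

# Illustrative sketch only: models how start/stop preconditions combine for the
# operating methods described above. Names and structure are assumptions.

REQUIRED_INPUTS = {
    "trigger_only":                     {"trigger"},
    "trigger_and_proximity":            {"trigger", "proximity"},
    "momentary_trigger_and_proximity":  {"proximity"},   # trigger only needed to start
    "trigger_and_interlock":            {"trigger", "interlock"},
    "trigger_interlock_and_proximity":  {"trigger", "interlock", "proximity"},
    "easy_trigger":                     {"backscatter_detects_sample"},
}

def reading_may_continue(method: str, active_inputs: set, elapsed_s: float,
                         max_time_s: float) -> bool:
    """A reading continues only while every required precondition is still met
    and the Max Time limit has not been reached."""
    required = REQUIRED_INPUTS[method]
    return required.issubset(active_inputs) and elapsed_s < max_time_s

# Example: releasing the proximity sensor mid-reading stops the analysis.
print(reading_may_continue("trigger_interlock_and_proximity",
                           {"trigger", "interlock"},
                           elapsed_s=4.0, max_time_s=30.0))   # -> False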
Note The LED lights will blink whenever the x-ray tube is on.
WARNING Do not touch the nose during sample testing or calibration. If an ESD event
occurs during a measurement, the instrument may terminate the test in progress and
automatically reset to the LogOn screen. Any test data collected prior to the reset will be lost,
and the test may have to be repeated.
WARNING The preconditions for operation must be maintained for the duration of the
reading. If the preconditions are violated, the x-ray tube will turn off, the calibration shutter
will close, and the measurement will end. The LED lights will stop blinking when the
measurement ends. To minimize power consumption, the flashing of the LED lights is not
synchronized.
To end the test, simply release the trigger mechanism, or any other applicable preconditions.
Camera
The Camera feature is only usable with properly configured analyzers.
If your analyzer is equipped with an internal video camera, you can turn the camera on and
off, and turn the saving of images with readings on and off, using the checkboxes described
below. When the camera is on, its image is shown in the Ready to Test screen. If the camera is
off, image saving is also off. If both the camera and the image saving function are on, images
are automatically saved with each reading. Saving images reduces the maximum number of
readings that can be stored.
Selecting the empty checkbox next to Enable Camera will turn the internal camera on,
displaying the camera view in the Ready to Test screen. Selecting the checkbox again turns the
camera off. Enable Camera is enabled by default.
Selecting the empty checkbox next to Save Image will enable image saving with the analysis.
Selecting the checkbox again will disable automatic saving of image data. Save Image is
enabled by default.
Stored camera images from previous measurements can be viewed on the analyzer.
Data Management
Contents
• “Viewing Data” on page 115
• “Viewing Fingerprints” on page 122
• “Erasing Data, Readings, or Fingerprints” on page 123
• “Managing Libraries” on page 125
Viewing Data
Use the Data Screen to view previously taken test result readings. When the View Data icon is
selected, the Results screen of your most recent test is shown on the Touch Screen.
Using the buttons on the control panel, you can view different readings or additional data for
individual readings. Your analyzer displays the standard analysis screen. Pressing the Down
Button on the 4-way touch pad will display a complete scrolling elemental chemistry listing.
Each press of the Down Button scrolls the screen down to the next element. You can also use
the scroll bar along the right side to scroll or page through the elements.
Sorting Elements
You can sort element rows by various criteria in order to view your data in the manner you
prefer. The Sort Buttons, which double as column headings, can be used to re-sort the data in
different ways. The default data screen displays the standard sort, as defined on “Advanced
Settings / Element Sorting”. Selecting the appropriate Sort Button once toggles the sort order
to High-to-Low. Selecting the Sort Button again toggles the sort order to Low-to-High. To
return to the Standard Sort, select the Sort Button a third time.
Element Sorts
Element sorts are performed alphabetically based on the element symbol.
Composition Sorts
Composition sorts are performed numerically based on the percentage of composition, i.e.
from highest to lowest concentration, or by toggling again, from lowest to highest.
Error Sorts
Error sorts are performed based on the size of the error in the reading, i.e. from largest to
smallest error, or by toggling again, from smallest to largest.
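If you later export your results to a PC, the same three sort orders can be reproduced off the analyzer. The following Python sketch assumes a simple (symbol, percent, error) row format, which is an illustration only and not the analyzer's data format.

# Illustrative sketch: reproduces the element, composition, and error sorts on a
# list of result rows. The row format and values are examples only.

rows = [
    ("Fe", 71.2, 0.30),
    ("Cr", 18.1, 0.15),
    ("Ni",  8.4, 0.12),
    ("Mn",  1.1, 0.05),
]

element_sort     = sorted(rows, key=lambda r: r[0])                 # alphabetical by symbol
composition_high = sorted(rows, key=lambda r: r[1], reverse=True)   # high-to-low %
composition_low  = sorted(rows, key=lambda r: r[1])                 # low-to-high %
error_high       = sorted(rows, key=lambda r: r[2], reverse=True)   # largest error first
error_low        = sorted(rows, key=lambda r: r[2])                 # smallest error first

for symbol, percent, error in composition_high:
    print(f"{symbol:>2}  {percent:6.2f} %  +/- {error:.2f}")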
Spectrum Graph
For any reading result, simply use the NAV Menu to gain access to the reading’s spectrum
graph. Selecting Spectra will show a graphed spectrum of this reading, called SpectraView.
SpectraView can be a useful tool for rapid, qualitative analysis of a sample. See Viewing the
Spectrum for details.
SpectraView
SpectraView enables you to visually inspect the fluorescent x-ray peaks obtained from any
sample and qualitatively identify them using the on-board software. In SpectraView Mode,
the spectrum is displayed using a linear energy scale along the x-axis, with the count rate
autoscaled logarithmically on the y-axis so that the highest peak on the screen reaches the top
of the scale.
• Ka, Kb, La, Lb, and/or Lg peaks of the three elements closest to where your cursor is
positioned on the energy scale (Bottom Right). This information is written with the
element symbol first, followed by either Ka (K shell alpha peak), Kb (K shell beta
peak), La (L shell alpha peak), Lb (L shell beta peak), or Lg (L shell gamma peak). An
example would be “Al Ka 1.48." To determine if a given element is present, look at
the count rate at that cursor position.
Note SpectraView cannot be used to determine exact element percentages in a sample.
Select the FIT button in the upper right hand corner of the Spectrum to fit the area of interest
to the display area.
The view of the spectrum will change to show only the area of interest.
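If you export spectra for offline review, the SpectraView presentation can be approximated on a PC. The Python sketch below uses synthetic data and an assumed 20 eV-per-channel energy scale purely for illustration; it is not the analyzer's display code.

# Illustrative sketch: plots a spectrum the way SpectraView presents it, with a
# linear energy axis and a logarithmic count axis scaled to the tallest peak.
# The synthetic data and the 20 eV/channel assumption are for illustration only.

import numpy as np
import matplotlib.pyplot as plt

channels = np.arange(2048)
energy_kev = channels * 0.020                                       # assumed 20 eV per channel
counts = 5 + 4000 * np.exp(-((energy_kev - 6.40) ** 2) / 0.02)      # Fe Ka near 6.40 keV
counts += 600 * np.exp(-((energy_kev - 7.06) ** 2) / 0.02)          # Fe Kb near 7.06 keV
counts = np.random.default_rng(0).poisson(counts)                   # add counting noise

plt.plot(energy_kev, counts)
plt.yscale("log")                                                   # log count axis, like SpectraView
plt.ylim(1, counts.max() * 1.2)                                     # top of scale near the highest peak
plt.xlabel("Energy (keV)")
plt.ylabel("Counts")
plt.title("SpectraView-style display (synthetic data)")
plt.show()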
Multiple Ranges
SpectraView can display any individual spectra, including those obtained from multiple
Ranges (filters) if you are using more than one Range. Use the NAV Menu to select which
spectrum to view.
The Spectra1 choice will display the spectrum produced by the first Range.
The Spectra2 choice will display the spectrum produced by the second Range.
SpectraView Navigation
Use the left button on the 4-way touch pad to expand the spectrum, centered on the position
of the cursor.
Use the right button on the 4-way touch pad to contract the spectrum, centered on the
position of the cursor.
Viewing Fingerprints
Select the View Fingerprints icon to view data saved as reference sample Fingerprints in Teach
Fingerprint Mode. When the View Fingerprints icon is selected, the Results Screen of your
most recent Teach Fingerprint is shown on the Touch Screen display.
Erase Readings
Select the Erase Readings icon to erase all accumulated test readings from your analyzer.
Selecting the Erase Readings icon will bring up a confirmation screen asking you “Are you
sure?” with options to select “YES” or “NO”. Selecting the Yes Button will erase all test
reading data from your analyzer. Selecting the No Button will return you to the Erase Menu.
Erase Fingerprints
Select the Erase Fingerprints icon to erase all accumulated alloy fingerprints from your
analyzer. Selecting the Erase Fingerprints icon will bring up a confirmation screen asking you
“Are you sure?” with options to select “YES” or “NO”. Selecting the Yes Button will erase all
fingerprint data from your analyzer. Selecting the No Button will return you to the Erase
Menu.
Managing Libraries
Select the Manage Libraries icon to access the Library Management Menu. The Library
Management Menu allows you to view and modify data in the Primary Library as well as the
currently loaded alternate libraries. Just select the library you wish to view or edit from the list
on screen.
The entries in the Grade Library serve as a reference for chemistry based analysis. The library
entries allow the analyzer to work properly “out of the box” without needing time-consuming
pre-analysis.
(Name in List)
Selecting the actual name of the alloy - e.g. “Fe/CS” - will bring up the Element Specification
Screen.
Add Button
Selecting the Add Button will add a new alloy to the Library. First the Alloy Name Editor will
appear, enabling you to enter the name of the new alloy.
The Alloy Name Editor is a standard Virtual Keyboard. Use it as you would any Virtual
Keyboard. Hitting the return key enters the name into the alloy list. Select the name of the
new alloy to bring up the Element Specification Screen and enter the specification of the alloy.
Del Button
Selecting the Del Button will delete the currently selected alloy. First a confirmation screen
appears.
Selecting the Yes Button will delete the alloy from the list. Selecting the No Button will return
you to the Alloy List.
Figure: the Element Specification Screen, with callouts for the library name, alloy name,
element to be edited, and the minimum and maximum percentages.
Library Name
This is the name of the library you are editing. Make sure you are editing the correct library
before proceeding further.
Alloy Name
This is the name of the alloy you are editing. Make sure you are editing the correct alloy
before proceeding further.
Element to be Edited
This is the element you need to edit for this alloy.
Minimum Percentage
This is the lowest amount of the element in question you want to be in the alloy. If the
element in the analyzed sample is any lower, the sample will not be recognized as this alloy.
Selecting the element minimum will open the Minimum Editor.
Maximum Percentage
This is the highest amount of the element in question you want to be in the alloy. If the
element in the analyzed sample is any higher, the sample will not be recognized as this alloy.
Selecting the element maximum will open the Maximum Editor.
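To illustrate how the minimum and maximum percentages are used, the following Python sketch checks an example chemistry against a hypothetical library entry. The grade name, element windows, and matching rule shown here are simplified assumptions, not the analyzer's actual grade-matching algorithm.

# Illustrative sketch: checks a measured chemistry against the minimum/maximum
# percentages of a library entry. All values below are examples only.

# Grade entry: element -> (minimum %, maximum %)
example_grade = {
    "Cr": (17.5, 20.0),
    "Ni": (8.0, 11.0),
    "Fe": (65.0, 75.0),
}

measured = {"Cr": 18.2, "Ni": 8.6, "Fe": 71.0, "Mn": 1.2}

def matches_grade(chemistry, spec):
    """A sample matches only if every specified element falls inside its
    minimum/maximum window; elements not in the spec are ignored here."""
    return all(lo <= chemistry.get(el, 0.0) <= hi for el, (lo, hi) in spec.items())

print(matches_grade(measured, example_grade))   # True for this example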
Connectivity
Contents
• “Installing the Windows 7 USB Driver” on page 131
• “Using a USB Cable to Connect Your Analyzer” on page 136
• “Downloading Data” on page 136
This section discusses how to connect your computer and your analyzer, for data transfer and
integration, translation to other formats, data storage and security, as well as controlling your
analyzer remotely through your computer. Connection can be achieved via USB.
2. Click on “Control Panel” and locate the “Device Manager”. If it is not available directly
under “Control Panel”, look under “System and Security” then “System”.
9. Click the “Browse” button; select the CD drive, or the location of the driver if you are
not installing from the NDT CD (recommended).
12. A Security Dialog Box will appear. Select “Install This Driver Software Anyway?”
Downloading Data
Standard Download
To download data you have collected offline:
1. Make sure that the XRF Analyzer is connected to your computer.
2. Turn on the XRF Analyzer.
Note Wait at least 30 seconds after turning on the XRF Analyzer to begin downloading files.
The System Start screens do not allow downloading.
3. Start Niton Data Transfer.
Note Niton Data Transfer and NDTr cannot both be open at the same time.
4. Click the Download button. The Download dialog box will open.
10. The download generates a data file containing the selected readings. To save the file for
later use:
c. Enter the path for the file in the Destination Folder field. You can use the ... button
to browse.
When the progress bar shows that all the readings are downloaded, click the Done button.
You should now see the readings you selected for download displayed, one reading per
horizontal line. The data has been saved to the folder and filename you indicated prior to
downloading. If an error message has appeared, see the following section.
You can also automatically save reports in .csv format for importing into Excel or other
programs.
Simultaneous Save as CSV File
To save a .csv file simultaneously, the following preconditions must be met:
• Your Niton analyzer must be turned on and connected to the PC. See Using a USB Cable to
Connect Your Analyzer.
• The NDTr program module must be running and connected to your analyzer. See
Operating Your Analyzer Remotely.
The file created is in a format readable by the NDT program module, has an extension of
.ndt, and looks identical to a file of manually downloaded readings - see Standard Download.
It can also create a simultaneous .csv file; see Simultaneous Save as CSV File.
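Once a .csv report has been saved, it can be opened in Excel or read by a short script. The sketch below is only an example: the filename and column names are hypothetical, so adapt them to the header row of your own export.

# Illustrative sketch: loads a downloaded .csv report for further processing.
# The filename and column names ("Reading No", "Fe") are assumptions about the
# export layout; check the header row of your own file.

import csv

with open("readings.csv", newline="") as f:      # hypothetical exported file
    rows = list(csv.DictReader(f))

for row in rows:
    reading_no = row.get("Reading No", "?")
    fe = row.get("Fe", "")
    print(f"Reading {reading_no}: Fe = {fe}")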
3. Live Download does not overwrite any previous readings in the file. If you want to start
with an empty file, you must explicitly erase the file before initiating Live Download.
4. Live Download does not retroactively add any readings taken while your analyzer was
disconnected.
You can change the destination file or folder by clicking in the appropriate text box and typing
in the new file name, or by clicking on the browse button (...) to the right of the text box and
selecting a different pre-existing filename. To implement these changes, click the OK button.
Your instrument serial number is associated with the file. If a different instrument is
connected and Live Download is started, a message will appear saying that the connected
instrument and the file's instrument do not match, and Live Download will not start. Saving
the session as a new file will resolve this issue.
Note NDTr and Niton Data Transfer cannot both be open at the same time.
Start Measurement
Clicking this icon will initiate a measurement in whatever mode the analyzer is in currently.
Stop Measurement
Clicking this icon will halt any ongoing measurement on the analyzer.
Connect
Clicking this icon will attempt to establish a connection between your computer and your
analyzer.
Disconnect
Clicking this icon will disconnect your computer from your analyzer.
Live Download
See Live Download from NDTr
Select the proper COM port from the list, then select the OK Button.
Dest Folder
This field shows the last used save folder, defaulting to the folder where NDT is installed.
Selected File
This shows the filename the reading will be saved to unless you change it.
Always Show this Dialog Box when the File button is Pressed Checkbox
Selecting this checkbox will enable you to change the filename whenever you want.
Deselecting this checkbox will save the file under the same name in the same folder every
time. The checkbox is selected when there is a check in it, and deselected when it is empty.
Contents
• “Replacing the Measurement Window” on page 153
• “Tips and Troubleshooting” on page 156
• “Storing and Transporting Your Niton XL2 Plus Analyzer” on page 162
This section of the User's Guide is about getting the most out of your analyzer. We cover
troubleshooting your analyzer by using the Specs screen. We also cover advanced topics like
setting thresholds, using the Tools menu, correcting for light elements in the sample
composition, setting up pass/fail analysis, changing safety and start/stop parameters, and
many other special situations you may encounter. We have also included a number of
documents for reference, so you can learn more about XRF analysis if you are so inclined.
• When the bracket is clean, remove the backing from the Measurement Window. Place the
window on the Bracket gently. Make sure the opaque portions of the window do not intrude
over the large measurement hole in the Bracket.
Figure 126. Removing the Backing from Prolene Window (Left) and Applying Window to Bracket (Right)
CAUTION Do not use your fingers to press the window into place! Use a smooth, hard surface,
such as the back of a pair of tweezers.
Select the Specs icon from the System Menu to display the analyzer's specifications. These
specifications include your analyzer's serial number, software and firmware versions, and other
data. Press the Close Screen Button to return to the previous menu. Press the “->” Screen
Button to go to the Diagnostic Menu, and press the “<-” Screen Button to return to the
Specifications Screen.
On the Specs Screen, standard information on the state of your analyzer is shown for your
reference. This information should be reported to Service if there is a problem.
Specs Information
The following is the information supplied on the Specs Screen:
Model Number
This is located in the right part of the blue band at the top of the screen.
SW Version
This is the currently loaded software version, which should be reported to Service if there is
any problem.
FPGA
This is the currently loaded FPGA software version, which should be reported to Service if
there is any problem. FPGA versions are always a four digit number. Any other number of
digits may be a sign of a problem in the FPGA code.
Factory QC
This is the date that the machine was QCed at the factory.
Energy Cal
This line notes the last time a System Check was performed.
Battery
This line gives the proportional charge remaining to the battery.
Diagnostics
Select the “->” Screen Button to load the Diagnostics Screen. The Diagnostics Screen shows
Detector Temperature, Bias, Cooler Voltage, SubBias, Energy Scale, and Temperature in C
and F scales.
The Diagnostics Screen can be of great utility in assuring proper operation of your analyzer.
Det Temp:
Detector Temperature should be within this range:
-25 ± 5 degrees
Bias:
Bias should be within this range:
175 ± 10
VCool:
VCool will vary with the ambient temperature.
SubBias:
SubBias should be within this range:
-11 ± 3
Escale:
Escale should be within this range:
Preamp:
Preamp value should only be noted, and reported to Service if there is a problem.
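As a quick way to log and check these values over time, the following Python sketch compares example Diagnostics readings against the nominal ranges listed above. The example readings are invented, and VCool, Escale, and Preamp are omitted because this guide does not give fixed ranges for them.

# Illustrative sketch: compares Diagnostics Screen values against the nominal
# ranges listed above. The example readings below are made up.

NOMINAL_RANGES = {
    "Det Temp": (-25.0, 5.0),   # (center, tolerance): -25 ± 5 degrees
    "Bias":     (175.0, 10.0),  # 175 ± 10
    "SubBias":  (-11.0, 3.0),   # -11 ± 3
}

example_readings = {"Det Temp": -23.4, "Bias": 176.2, "SubBias": -12.1}

for name, value in example_readings.items():
    center, tol = NOMINAL_RANGES[name]
    status = "OK" if abs(value - center) <= tol else "report to Service"
    print(f"{name}: {value} ({status})")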
FAQ
Q: What is the max mA, max kVp, and max power?
A: The maximum current is 200 µA.
A: This should be filled out as not applicable N/A as it does not apply to Niton XL2 Plus
analyzers.
A: States differ greatly in their categories; the following is a list of common categories:
• X-Ray Fluorescence
When selecting the category make sure that you don't select medical or radiographic.
A: One.
A: The serial number of the tube can be found on the Calibration Certificate that is included
in the shipping case with your analyzer.
A: The serial number of the tube can be found on the Calibration Certificate.
Q: How often do I need to perform leak tests on the Niton XL2 Plus?
A: Never. Leak tests are only required for analyzers with radioactive isotopes. Niton XL2 Plus
analyzers do not have radioactive isotopes.
All padlocks are shipped with a default combination of “0-0-0”. If you change this
combination, please inform Thermo of the new combination if you return the unit for
service.
5. Pull the shackle out, rotate it 180 degrees, and secure it. Your lock now has its own secret
combination.
CAUTION Always transport the unit in its padded carrying case, and store the Niton Analyzer
in its case whenever it is not being used.
CAUTION Within the United States, always keep a copy of the US DOT compliance
statement in your Niton analyzer case at all times. A copy is included with your analyzer.
CAUTION Always follow all pertinent local and national regulations and guidelines, wherever
your analyzer is transported or used.
CAUTION Always obtain a Return Authorization (RA) number from Thermo Fisher
Scientific’s Service Department in the United States, toll free, at (800) 875-1578, or outside
the United States, at +1-978-670-7460 before returning your analyzer to the Service
Department or to your local Authorized Niton Analyzers Service Center.
CAUTION If you return your Niton analyzer without the carrying case, you will void your
warranty in its entirety. You will be billed for a replacement case plus any repairs resulting
from improper shipping.
CAUTION Always remove the battery pack when transporting or storing your analyzer.
Test Stands
Contents
• “The Portable Test Stand” on page 165
• “The Mobile Test Stand” on page 173
• “Configuring the Analyzer for the Test Stand” on page 175
• “The Field Mate Test Stand” on page 181
• “Configuring the Analyzer for the Test Stand” on page 189
2. Lift up the top part of the Portable Test Stand. As you do so, the scissored legs will rise up
and come together.
3. Hook the notch in the back, flat part of the top over the bar between the rear legs, as in
Figure 134.
2. To take down the Portable Test Stand, unhook the back, flat part of the top section of the
Test Stand. The hood locks shut with a sliding tab for transportation.
CAUTION When setting up and taking down the Portable Test Stand, be aware of possible
pinch points.
3. The air pistons between the scissored legs will let the top down slowly.
4. Support the top until the legs have fully collapsed. Figure 136 shows a Portable Test Stand
fully collapsed.
Figure 137. Plugging the Remote Trigger Cable Into the Portable Test Stand
Figure 138. Plugging the Remote Trigger Cable Into the Analyzer
Figure 139. Plugging the USB Cable into the Portable Test Stand
Figure 140. Plugging the Power Cable into the Portable Test Stand
Your analyzer can be fully inserted only in the correct orientation, with the touch screen
facing the front.
To remove your analyzer, simply squeeze the tabs on either side of the cone and pull the
analyzer down, gently but firmly, until it separates from the cone.
When the hood is lifted, your analyzer will display a red icon at the top of the touch screen, as
in Figure 143.
Your analyzer will not take a reading unless the hood is shut and the green icon is on.
2. Push the pop-up peg in with your thumb as shown in Figure 144.
3. Slide the leg into the main body until the pop-up peg snaps into the matching hole as
shown in Figure 145.
5. Adjust the height of each leg by screwing or unscrewing the feet to level the platform.
7. Place your analyzer underneath the Mobile Test Stand so that the nose points up and the
touch screen is facing you, as in Figure 146.
8. Lift your analyzer firmly up into the recess until the clips snap solidly into place on both
sides of the analyzer’s front end.
Press the trigger to initiate the reading, and disengage the Proximity Button by opening the
cover to stop the reading. With this feature enabled, you do not need to hold the trigger down
while the sample is being analyzed.
You can always use standard operation, as well as remote operation via NDTr.
To measure small alloy parts and samples, place the part to be measured directly on the
measurement window of the analyzer, shut the hinged top firmly, and take a measurement.
To measure a prepared sample cup, place the sample cup to be analyzed into the Sample Cup
Holder, shut the hinged top firmly, and initiate a measurement. See Figure 18 for an example
of placing sample cups into the Sample Cup Holder.
To measure a bagged sample, place the sample bag into the test stand, shut the hinged top
firmly, and take the measurement.
When the hood is lifted, your analyzer will display a red icon at the top of the touch screen, as
in Figure 152.
When the hood is shut, your analyzer will display a green icon at the top of the touch screen,
as in Figure 153.
Your analyzer cannot make an analysis unless the icon is green. To remove your analyzer from
the Mobile Test Stand, simply squeeze the tabs on either side of the cone and pull the analyzer
down, gently but firmly, until it separates from the cone.
The Field Mate has a pivoting top plate which covers a shielded cavity inside which the
sample is placed. There are two insets which fit inside the cavity, optimized for the two most
common sample types - bagged bulk samples and sample cups.
A brace on the underside of the base supports the analyzer. Gently but firmly unsnap the
brace by pushing down on the protruding black tab. The brace will pivot out and down
180 degrees. The brace helps compensate for the off-center weight of the analyzer when it is
clipped into the pivoting top plate.
To place your analyzer into the Field Mate, fit your analyzer's nose firmly up into the mating
cone until the clips snap solidly into place on both sides of the analyzer’s front end.
The top plate is opened by pressing the large orange latch on the base. Inserts can be placed
into the cavity, and samples can be placed into the inserts. The insert for bagged bulk samples
(shown in Figure 156) has a spring loaded inner plate which pushes the sample up to contact
the underside of the top plate for proper analysis. The insert for sample cups (shown in
Figure 157) holds the cup at the proper height for analysis with a soft foam support. Tilt the
top plate back until it latches with an audible click for analysis. This can be done with or
without your analyzer clipped into the cone.
The top plate can also be completely removed by pressing the two small hinge tabs together
towards the middle of the plate and lifting the plate from the base. This can be done with or
without your analyzer mounted into the cone.
Figure 158. The Top Plate Removed from the Base Unit
Removed from the base, the plate can be placed into the Test Guard, for soil screening in-situ.
To insert the plate into the Test Guard, slide the front tab on the plate under the tab of the
Test Guard.
Figure 160. Inserting the Top Plate into the Test Guard
The interlocking tabs secure the front of the Test Guard to the plate.
Squeezing the Hinge Tabs together, lower the back of the plate into the Test Guard. Release
the tabs so the Hinge Tab Pins protrude through the holes on either side of the Test Guard.
This secures the rear of the plate to the Test Guard.
At this point, you can fit your analyzer into the cone. It may now be used for soil screening.
Figure 163. The Top Plate Removed from the Base Unit
When the top plate is lifted, your analyzer will display a red icon at the top of the touch
screen, as in Figure 164.
When the plate is shut, your analyzer will display a green icon at the top of the touch screen,
as in Figure 165.
To remove your analyzer from the Field Mate Test Stand, simply squeeze the tabs on either
side of the cone and pull the analyzer up, gently but firmly, until it separates from the cone.
Press the trigger to initiate the reading, and disengage the Proximity Button by pressing the
Latch and opening the plate to stop the reading. With this feature enabled, you do not need
to hold the trigger down while the sample is being analyzed.
You can always use standard operation, as well as remote operation via NDTr.