BS2010
Lecture 1: Introduction

1. Brief History of Light Microscopy
• Magnifying Glass: Earliest evidence from 424 BC, inspired by water droplets.
• Eyeglasses: Invented around 1300 AD, significant for later microscope development.
• First Compound Microscope: Built by Hans and Zacharias Jansen in 1595.

2. Pioneers in Microscopy
• Robert Hooke (1635–1703): Birth of cell biology; improved and popularized the compound microscope; author of "Micrographia."
• Antonie van Leeuwenhoek (1632–1723): Known for his simple microscope and for discovering protists, bacteria, and sperm.

3. Comparison of Microscopes
• Hooke's Compound Microscope: Plagued by chromatic and spherical aberrations.
• Leeuwenhoek's Simple Microscope: Superior in the 17th century because of the technological limitations of compound lenses at the time.

1. Chromatic Aberration:
○ Chromatic aberration occurs when different wavelengths (colors) of light are focused at different points along the optical axis. This is due to dispersion: the refractive index of the lens material varies with the wavelength of the light passing through it.
○ Types:
▪ Axial Chromatic Aberration: Longitudinal shift of focus for different colors, resulting in a blur along the axis.
▪ Lateral Chromatic Aberration: Transverse shift, where different colors come into focus at different lateral positions.
○ Effects: Colored fringes around objects in the image and a reduction in the overall sharpness and clarity of the image.
○ Correction: Often corrected by combining lenses made from glasses with different dispersion properties, a technique known as achromatic or apochromatic lens design.

2. Spherical Aberration:
○ Spherical aberration is an optical defect that occurs because light rays passing through a spherical lens near its edge are not focused as sharply as those passing through the central region; a spherical surface bends rays differently depending on their distance from the optical axis.
○ Effects: A blurred image with a hazy, out-of-focus appearance, particularly in the periphery of the field of view.
○ Correction: Reduced by using aspherical lens shapes that bring the rays to a common focus; some high-quality microscopes also use special lens designs and coatings to minimize these aberrations.
1. Collimated Light:
○ Collimated light refers to a set of light rays that are parallel to each other; they are neither converging nor diverging as they travel. This property is important in optical systems where parallel rays are needed for proper focusing or to maintain a consistent intensity over a given area.
○ Collimation is typically achieved using devices such as collimating lenses or mirrors, which adjust the light rays to be parallel.
○ Lasers are often described as collimated because they naturally produce light that is both intense and parallel over considerable distances.

2. Coherent Light:
○ Coherence refers to the phase relationship between different light waves. Coherent light has a constant phase difference between waves, which means they can interfere constructively or destructively when they overlap.
○ A light source is considered coherent if it has a single, well-defined frequency (or a very narrow range of frequencies) and if the light waves maintain a fixed phase relationship with each other.
○ Lasers are a prime example of coherent light sources because they emit light that is not only monochromatic (single wavelength) but also has a fixed phase relationship across the entire beam.

Relationship Between Collimation and Coherence:
• While all lasers produce coherent light, not all coherent light sources are collimated. For example, some light-emitting diodes (LEDs) can emit light that is coherent over a small area but is not collimated.
• Conversely, light can be collimated without being coherent. For instance, sunlight can be made collimated using a lens or mirror, but it is not coherent because the light waves do not have a fixed phase relationship.
In summary, collimated light has parallel rays, and coherent light has a fixed phase relationship between its waves. A laser can be both collimated and coherent, but these terms describe distinct optical properties, and one does not necessarily imply the other.

Incoherence
Incoherent light is light in which the phase relationship between different light waves is random or constantly changing. The waves do not have a fixed or consistent phase difference and therefore do not exhibit constructive or destructive interference over any significant period or distance. Key points about incoherent light:
1. Phase Variation: The phase of the light waves varies randomly, leading to a lack of a stable interference pattern.
2. Spectrum: Incoherent light sources often have a broad spectrum, meaning they emit light over a range of wavelengths. A broad spectrum does not by itself make light incoherent; it is the lack of a fixed phase relationship that defines incoherence.
3. Natural Light: Most natural light sources, such as the sun or a light bulb, emit incoherent light because the light comes from many different atoms or molecules that are not phase-locked.
4. Speckle Pattern: Incoherent light does not produce the pronounced speckle pattern seen when coherent (laser) light is scattered off a rough surface, because the random phases average out the interference.
5. Applications: Incoherent light is used in many everyday applications because it is generally easier to produce and handle. Incandescent bulbs, fluorescent lamps, and LEDs are all incoherent light sources.
6. Microscopy: In microscopy, incoherent light is often preferred because it provides a more even illumination of the sample, reducing the risk of damaging the sample and producing a clearer image.
7. Interference and Diffraction: Incoherent light does not produce sharp interference or diffraction patterns, which can be an advantage in applications where a uniform light distribution is desired.
8. Thermal Light: Light emitted by a hot object, such as the glowing filament of an incandescent bulb, is typically incoherent because it is emitted by a large number of atoms or molecules that are not synchronized.
In summary, incoherent light is characterized by a random phase relationship between its constituent waves, which leads to a lack of stable interference effects. This makes incoherent light suitable for many applications where a uniform and non-damaging light source is needed.

Lecture 2: Bright-Field Microscopy

Objective Lens (Objective)
• Definition: The "eye" of the microscope, responsible for primary image formation.
• Features: Anti-reflective coatings and various specifications, including magnification, aberration correction, infinity-corrected design, cover glass thickness, working distance (WD), and immersion medium.
• Mounting: Objectives are mounted on a nosepiece.

Image Quality and Magnification
• Common Magnifications: 5X, 10X, 20X, 40X, 60X, 100X.
• Eyepiece (Ocular): Used in conjunction with the objective to provide the total magnification.
• Total Magnification: Calculated as Objective Magnification × Eyepiece Magnification.
• Useful Total Magnification: Limited by the numerical aperture (NA), often around 500 × NA, to avoid empty magnification.

In the context of microscopy, there are two key terms related to magnification: total magnification and useful magnification. Here's the distinction between the two:

Total Magnification
• Definition: The overall magnification achieved by a microscope, which is the product of the magnification of the objective lens and the eyepiece (ocular lens).
• Calculation: Total Magnification = Objective Magnification × Eyepiece Magnification.
• Example: With a 60X objective and a 10X eyepiece, the total magnification is 600X (60X × 10X).

Useful Magnification
• Definition: The magnification at which the microscope can still effectively resolve details in a specimen. Beyond this point, increasing magnification does not improve the clarity or resolution of the image; it may actually make it more blurry or less detailed ("empty magnification").
• Limitation: Useful magnification is limited by the numerical aperture (NA) of the objective lens and the wavelength (λ) of the light used.
• Calculation: A common rule of thumb is that the useful total magnification is about 500 times the numerical aperture (500 × NA). This is a general guideline and can vary with the specific microscope and the quality of its components.
• Example: If the NA of an objective lens is 1.4, the useful magnification would be around 500 × 1.4 = 700X.

Key Differences
1. Purpose: Total magnification is a straightforward multiplication of the objective and eyepiece magnifications, while useful magnification is the practical limit of resolving power for a given microscope setup.
2. Resolution: Total magnification does not consider the resolving power of the microscope, whereas useful magnification is directly related to the resolving power determined by the NA and the wavelength of light.
3. Image Quality: Beyond the useful magnification, image quality deteriorates because the microscope cannot resolve finer details, resulting in a blurred or less distinct image even though the total magnification is higher.
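The two magnification quantities above reduce to simple arithmetic. The sketch below is illustrative only (the function names and the default 500 × NA factor are not part of the lecture material):

def total_magnification(objective_mag: float, eyepiece_mag: float) -> float:
    """Total magnification = objective magnification x eyepiece magnification."""
    return objective_mag * eyepiece_mag

def useful_magnification(numerical_aperture: float, factor: float = 500.0) -> float:
    """Rule-of-thumb limit beyond which extra magnification is 'empty'."""
    return factor * numerical_aperture

total = total_magnification(60, 10)        # 600X
useful = useful_magnification(1.4)         # ~700X
print(f"Total: {total:.0f}X, useful limit: {useful:.0f}X")
if total > useful:
    print("Beyond the useful limit: expect empty magnification.")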
Optical Aberrations
• Types: Chromatic, spherical, and field curvature aberrations.
• Corrections: Different objective types correct for these aberrations to varying degrees (Achromat, Plan Achromat, Fluorite, Plan Fluorite, Plan Apochromat).

Field Curvature
• Definition: Field curvature is an optical aberration in which the image plane is not flat but curved. A flat object in the field of view will therefore not be in perfect focus at all points simultaneously: the best focus is typically achieved at the center of the field, with focus deteriorating towards the edges.
• Effect on Image: Objects at the edges of the field may appear blurred or distorted even when the center is in sharp focus. This can be particularly problematic when observing large specimens or when high-quality imaging across the entire field of view is required.
• Correction: High-quality objectives often have corrected or reduced field curvature so that the entire field of view, or at least a larger, flatter area, is in focus. This is especially important in applications such as photography or digital scanning of microscopy slides.

Lecture 3: Wide-Field Microscopy

Wide-field fluorescence microscopy and bright-field microscopy are two different imaging techniques used in microscopy. Here's a comparison of the two:

Bright-Field Microscopy:
1. Principle: Bright-field microscopy is the most basic form of microscopy. The specimen is illuminated by a light source, and the light transmitted through the specimen is collected by the objective lens to form an image.
2. Contrast: It relies on the absorption of light by the specimen to create contrast. Denser or thicker areas of the specimen absorb more light and appear darker.
3. Resolution: It typically has a lower resolution compared to more advanced microscopy techniques.
4. Setup: It is relatively simple and does not require specialized equipment or dyes.
5. Applications: It is widely used for observing transparent or opaque samples, such as cells, tissues, and microorganisms.
6. Limitations: The main limitation is that living cells or unstained samples may not provide enough contrast to be clearly visualized (exception: phase contrast microscopy).

Wide-Field Fluorescence Microscopy:
1. Principle: In wide-field fluorescence microscopy, the specimen is labeled with fluorescent dyes or tagged with fluorescent proteins. Excitation light of a specific wavelength illuminates the specimen, causing the fluorophores to emit light at a different wavelength.
2. Contrast: It provides high contrast by using the specific emission from the fluorophores, which can be detected separately from the excitation light.
3. Resolution: Offers higher resolution, which can be enhanced by using objectives with higher numerical apertures (NA).
4. Setup: Requires specialized filters (excitation and emission filters) and a light source that provides the specific excitation wavelength needed by the fluorophores.
5. Applications: Ideal for studying cellular processes, protein co-localization, gene expression, and other applications requiring specific molecular labeling.
6. Advantages: It allows detection of specific biomolecules and multi-color imaging, and can be used for live-cell imaging with minimal phototoxicity.
7. Limitations: The need for fluorescent labeling can be a limitation, as it may require genetic manipulation or the use of antibodies. Additionally, photobleaching can be an issue, where the fluorophores lose their ability to fluoresce over time.
In summary, while both techniques allow visualization of microscopic samples, wide-field fluorescence microscopy offers higher contrast and specificity through molecular labeling, whereas bright-field microscopy is a more straightforward technique that relies on the absorption of light by the sample to create an image.

1. Introduction to Fluorescence Microscopy
• Fluorescence is a process in which a substance absorbs light and re-emits it at a different wavelength.
• Key concepts: Photoluminescence, Stokes shift, Fluorophore, Phosphorescence.
a. Photoluminescence:
▪ Photoluminescence is a collective term for the phenomenon in which a substance absorbs light (photons) and then re-emits light. Absorbed energy excites electrons in the substance to a higher energy state, and this energy is released as light when the electrons return to their original (ground) state.
b. Stokes Shift:
▪ The Stokes shift is the difference in wavelength between the absorbed light and the emitted light in a photoluminescent process. Typically, the emitted light has a longer wavelength (and thus lower energy) than the absorbed light, because some of the absorbed energy is lost as heat or through other non-radiative processes before the light is re-emitted.
c. Fluorophore (Fluorescent Molecule):
▪ A fluorophore is a specific type of chromophore that can undergo fluorescence: a molecule that absorbs light at a certain wavelength and re-emits it at a longer wavelength. Fluorophores are used extensively as markers or labels in many scientific applications, including fluorescence microscopy, to visualize and track specific structures or molecules within a sample.
d. Phosphorescence:
▪ Phosphorescence is a type of photoluminescence in which the emitted light persists after the excitation light has stopped. Unlike fluorescence, which lasts only nanoseconds, phosphorescence can last for minutes or even hours. This extended emission arises from a different mechanism involving the transition of electrons to a 'triplet state,' which is longer-lived than the 'singlet state' involved in fluorescence; phosphorescent materials can therefore store energy for a longer period before releasing it as light.

2. Molecular Explanation of Fluorescence
• Involves an electron transition from the ground state to an excited state within a molecule.
• Quantum yield and Stokes shift are critical factors.
• Photobleaching is a concern: fluorophores lose their ability to fluoresce.

CCD
CCD stands for Charge-Coupled Device. It is a type of semiconductor device widely used for capturing and processing images, as well as for detecting and measuring light intensities. Here's a detailed explanation of what a CCD is and its functions:
1. Image Capture:
○ A CCD is used in digital cameras, including those found on microscopes, to capture images. It functions as an electronic light detector, converting light that passes through the camera lens into electrical charge.
2. Pixel Array:
○ The CCD consists of an array of light-sensitive elements called pixels. Each pixel corresponds to a point in the captured image and accumulates a charge proportional to the light intensity it receives.
3. Charge Transfer:
○ The key feature of a CCD is its ability to transfer the electrical charge generated in each pixel sequentially to adjacent pixels. This is how the device gets its name: it "couples" together the charges from each pixel.
4. Readout Register:
○ The accumulated charges are eventually transferred to a readout register, where they are converted into a voltage signal. This signal is then processed and digitized to create the final image.
5. High Sensitivity and Low Noise:
○ CCDs are known for their high sensitivity to light and low noise levels, making them ideal for capturing clear and detailed images even in low-light conditions.
6. Resolution:
○ The resolution of a CCD is determined by the number of pixels in the array. More pixels allow higher-resolution images that capture more detail of the scene.
7. Monochromatic vs. Color CCDs:
○ CCDs can be monochromatic (black and white) or color. Color CCDs use a color filter array (CFA), typically a Bayer filter pattern, to capture color information, which is then processed to create a full-color image.
8. Applications:
○ Beyond digital cameras and microscopes, CCDs are used in a wide range of applications, including astronomy, where they capture images of celestial objects through telescopes, and scientific research, for detecting and measuring light in various contexts.
9. Advantages:
○ CCDs offer high quantum efficiency, good linearity, and excellent image quality, and they can capture images with a wide dynamic range.
10. Limitations:
○ CCDs can be more expensive than alternative image sensors such as CMOS (Complementary Metal-Oxide-Semiconductor) sensors. They also require more power to operate and can have slower readout times than CMOS sensors.
In summary, a CCD is a sophisticated light detection device that is central to the operation of many digital imaging systems. Its ability to capture light as electrical charge and transfer that charge to create a high-quality image makes it a critical component in many imaging applications, including microscopy.
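The sequential charge-transfer and digitization steps described for the CCD can be mimicked with a small, idealized simulation. This is a conceptual sketch only; the gain, bit depth, and Poisson photon counts are assumed values, and real CCD electronics are far more involved:

import numpy as np

def ccd_readout(charges_e: np.ndarray, gain_e_per_adu: float = 2.0,
                bit_depth: int = 12) -> np.ndarray:
    """Shift each row of accumulated charge into a serial register, then
    convert charge to digital numbers (ADU), clipping at the ADC limit --
    a toy model of sequential CCD charge transfer and readout."""
    max_adu = 2 ** bit_depth - 1
    image = np.zeros(charges_e.shape, dtype=np.uint16)
    for r in range(charges_e.shape[0]):            # one row at a time
        serial_register = charges_e[r].astype(float)
        adu = np.clip(serial_register / gain_e_per_adu, 0, max_adu)
        image[r] = adu.astype(np.uint16)           # digitized pixel values
    return image

photons = np.random.poisson(lam=200, size=(4, 6))  # toy photon/electron counts
print(ccd_readout(photons))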
4. Localization Precision: In both PALM and STORM, the position of individual molecules is determined with
nanometer precision, often by fitting the point spread function (PSF) to a Gaussian function or by calculating the
center of mass of the fluorescence signal. Image Segmentation by Thresholding
5. Applications: Super-resolution microscopy has a wide range of applications in cell biology, including the study of • Process: Selecting certain pixels (ROI) based on intensity thre
protein interactions, cellular organelle dynamics, and the structure of biological macromolecules. • Outcomes: Object (≥ threshold) and background (< threshold) pix
6. Technical Requirements: These techniques often require specialized equipment, software for image analysis, and • Binary Image/ROI: Generated for morphometric analysis.
sophisticated sample preparation methods.
Morphometry is a branch of biology that involves the measurem
Single Molecule Localization Techniques biological structures. In the context of image analysis and micro
• Gaussian Fitting: Fitting the Airy disk (2D) or PSF (3D) with a 2D Gaussian function to determine the position of a single to the quantitative measurement and analysis of the geometric
An image histogram is a graphical representation that shows the distribution of pixel image. This can include cells, tissues, or other biological entitie
molecule at nanometer resolution. intensities or pixel values in an image. The purpose of an image histogram includes:
• Center of Mass: Using the concept of center of mass to locate the precise position of a single molecule, where the PSF Here are some key aspects of morphometric analysis:
1. Visualizing Intensity Distribution: It provides a visual summary of the spread of 1. Measurement of Length, Area, and Volume: Morphom
is considered as a physical object with mass equivalent to its fluorescence intensity. pixel intensities, showing how many pixels correspond to each possible intensity measuring the dimensions of structures. For instance, in
value. length of cells, the area of a tissue section, or the volume
Precision in Single Molecule Localization 2. Assessing Image Contrast: By looking at the histogram, one can quickly 2. Pixel-Based Calculations: In digital images, morphome
• Precision Representation: The precision is denoted by σ (standard deviation), which approaches zero as the number of determine if an image has high contrast (a histogram that spreads across the entire by counting pixels. The size of each pixel in real-world un
photons ( n) increases. range of possible values) or low contrast (a histogram that is bunched up in a small and this calibration factor is used to convert pixel measur
• Analogy: Provided a shooting game analogy to explain why more photons result in higher precision. range). dimensions.
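A short simulation illustrates the center-of-mass localization idea and why precision improves with photon number. All values here (PSF width, ROI size, photon counts) are assumptions chosen for illustration, not parameters from the lecture; the spread of the estimates should shrink roughly as 1/√n:

import numpy as np

rng = np.random.default_rng(0)
SIZE, PSF_SIGMA, TRUE_X, TRUE_Y = 21, 1.3, 10.3, 9.7   # pixels (assumed)

def simulate_spot(n_photons: int) -> np.ndarray:
    """Draw n_photons from a 2D Gaussian PSF and histogram them onto pixels."""
    x = rng.normal(TRUE_X, PSF_SIGMA, n_photons)
    y = rng.normal(TRUE_Y, PSF_SIGMA, n_photons)
    img, _, _ = np.histogram2d(y, x, bins=SIZE, range=[[0, SIZE], [0, SIZE]])
    return img

def centroid(img: np.ndarray):
    """Intensity-weighted center of mass of the spot (in pixel units)."""
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = img.sum()
    return (xx * img).sum() / total, (yy * img).sum() / total

for n in (100, 1000, 10000):
    xs = [centroid(simulate_spot(n))[0] for _ in range(200)]
    print(f"n={n:>6}: localization std ~ {np.std(xs):.3f} px")  # shrinks ~1/sqrt(n)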
3. Identifying Overexposure and Underexposure: A histogram can indicate if parts 3. Image Segmentation: Before morphometric analysis, im
of the image are overexposed (too much white, indicated by a spike at the right where the region of interest (ROI) is isolated from the res
side) or underexposed (too much black, indicated by a spike at the left side). using various techniques, including thresholding, which s
4. Adjusting Image Settings: Photographers and image editors use histograms to on intensity differences.
adjust settings such as exposure, brightness, and contrast to achieve the desired 4. Feature Extraction: Morphometric analysis may involve
look for an image. features, such as the number of cells, the size and shape
5. Diagnosing Image Quality: In scientific imaging, histograms can help identify if organelles within a cell.
the image has been correctly exposed or if there is a loss of detail due to saturation 5. Statistical Analysis: The data obtained from morphome
or extreme dark areas. to statistical analysis to understand the variability, distribu
6. Post-Processing Guidance: During post-processing, histograms are used to different morphological parameters.
guide adjustments that can optimize the image's appearance and ensure that the
Advanced Super-resolution Techniques full dynamic range of the image is utilized.
6. Applications: Morphometric analysis is used in various f
• PALM and STORM: Utilize special fluorophores that can be turned on by an activation beam and localized with high ○ Cell Biology: To study cell growth, differentiation,
7. Machine Learning and Analysis: In computer vision and image analysis,
accuracy, followed by photobleaching. ○ Anatomy: For measuring the dimensions of organs
histograms can be used to extract features from images, which can be useful for
tasks like object recognition or image classification. ○ Pathology: To diagnose diseases based on the siz
What is FIONA? 8. Non-Destructive Editing: Since looking at a histogram doesn't alter the image, it's ○ Ecology and Taxonomy: To measure the physica
species identification and ecological studies.
Confocal Microscope FIONA, which stands for Fluorescence Imaging with One Nanometer Accuracy, is a super-resolution microscopy technique that a non-destructive way to evaluate and plan the editing process.
7. Tools and Software: There are numerous software tools
is part of a family of methods known as localization microscopy. Like PALM and STORM, FIONA also relies on the precise 9. Consistency in Batch Processing: When processing multiple images, histograms
localization of single fluorescent molecules to achieve a resolution far beyond the diffraction limit of light. can help ensure consistency across various images by targeting similar intensity that can automate or assist with measurements, providin
Here's a general overview of how FIONA works: distributions.
1. Sample Preparation: The biological sample is labeled with high-affinity fluorescent dyes or proteins that can be Visual Explanation of Segmentation
10. Technical Assessment: In professional settings, such as in print or film
specifically bound or fused to the target molecules within the cell. production, histograms are used to make sure that images meet technical
• 2D Intensity Plot: Intensity as the Z-axis, with bright regions a
2. Immobilization: The sample is typically fixed and immobilized on a slide to prevent movement during imaging. standards for brightness and contrast. threshold as background.
3. Activation and Localization: The fluorescent molecules are activated one by one, either stochastically or through a
controlled process, and their precise location is determined. This is often achieved by exciting the fluorophores with a Determining whether a histogram is "good" in the context of imaging depends on the
laser and then using advanced algorithms to determine the center of the point spread function (PSF) for each activated specific goals and requirements of the image. However, there are some general
molecule. guidelines and characteristics that can indicate a well-exposed and well-distributed
4. High Precision: FIONA aims to achieve localization precision of about one nanometer, which is significantly better than histogram:
the typical 20-30 nanometer resolution achieved by conventional light microscopy. 1. Full Utilization of Dynamic Range: A good histogram will spread across the
5. Data Accumulation: The process is repeated thousands or millions of times, accumulating a large dataset of molecular majority of the dynamic range without bunching up at either end. This suggests that
positions. the image has a good balance of highlights and shadows, and it is using the full
6. Image Reconstruction: The collected data is used to reconstruct a super-resolved image of the sample, where each tonal range available.
point of light represents the precise location of an individual fluorescent molecule. 2. No Clipping: There should be no spikes touching the very right or left edges of the
7. 3D Imaging: FIONA can also be extended to three dimensions by acquiring images at different focal planes and histogram, which would indicate that highlights or shadows are "clipped" or
computationally reconstructing the 3D distribution of the labeled molecules. overexposed/underexposed. Clipped areas can result in loss of detail.
8. Applications: FIONA is useful for studying molecular interactions, protein complexes, and the organization of cellular 3. Balanced Distribution: The graph should be relatively balanced, with a natural
structures with high spatial precision. distribution of tones from darks to lights. A skewed histogram might indicate an
image that is too dark or too light overall.
4. Detail in Shadows and Highlights: A good histogram will show detail in the
FIONA vs PALM shadows and highlights. ROI Generation and Manipulation
FIONA (Fluorescence Imaging with One Nanometer Accuracy) and PALM (Photoactivated Localization Microscopy) are both • ROI (Region of Interest): Specific areas of interest within an im
super-resolution microscopy techniques that rely on the precise localization of individual fluorescent molecules. However, there 6. Maximizing Dynamic Range Utilization
• Utilize the dynamic range efficiently by adjusting parameters so that the maximum image • Methods: Image segmentation or manual drawing.
are some differences in their approaches and applications:
value is about 50%-70% of the dynamic range. • Binary Image: Commonly uses values 0 and 255.
1. Fluorophore Activation:
○ PALM: Uses photoactivatable fluorescent proteins (PA-FPs) or dyes that can be switched from a non-fluorescent to Analysis of Nuclei by Hoechst 33342
a fluorescent state by light. 7. Types of Image Acquisition
• Time lapse • Fluorescence: Hoechst 33342 labels DNA, emitting blue fluore
○ FIONA: Typically employs organic dyes that are directly excited to a higher energy state from where they can emit
light. The specific method of fluorophore activation can vary, but it does not necessarily rely on photoactivation as in • Monochrome • Intensity Contribution: Nucleus intensity from chromatin DNA
PALM. • Multi-color • Background Subtraction: Subtracting constant background in
2. Localization Precision: • 3D (multi-z sections)
○ PALM: Aims to achieve nanometer-scale precision in localizing individual molecules, but the typical precision is • Single-color 3D Quantification in Nuclei Analysis
around 10-20 nanometers. • Multi-color 3D time lapse • Quantifiable Features:
○ FIONA: As the name suggests, it aims for even higher precision, with the goal of achieving near 1-nanometer ○ Number of nuclei (cell count).
accuracy in localization. 8. Monochrome and Multi-color Imaging ○ Area of each nucleus (nucleus size).
3. Imaging Process: • Monochrome: Single channel, example: ER in BSC1 cells expressing GFP-Sec61β. ○ Mean intensity of each nucleus (chromatin density).
○ PALM: Involves a cycle of activating a subset of fluorophores, localizing them, and then photobleaching them so • Multi-color: Multiple channels, each with its own color, example: Golgi complex and ER ○ Total intensity of each nucleus (total chromatin content).
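The thresholding, binary ROI, background subtraction, and nuclei-quantification steps listed above can be combined in a few lines. The sketch below is illustrative only: it assumes the Hoechst channel is already loaded as a 2D NumPy array, uses a constant background value and a manually chosen threshold, and relies on scipy.ndimage for labeling; it is not the specific workflow prescribed in the notes:

import numpy as np
from scipy import ndimage as ndi

def quantify_nuclei(hoechst: np.ndarray, threshold: float, background: float = 0.0):
    """Return nucleus count, per-nucleus area (px), mean and total intensity."""
    corrected = hoechst.astype(float) - background     # constant background subtraction
    mask = corrected >= threshold                      # object (>= threshold) vs background
    labels, n_nuclei = ndi.label(mask)                 # binary image -> labeled ROIs
    idx = np.arange(1, n_nuclei + 1)
    areas = ndi.sum(mask, labels, idx)                 # nucleus size in pixels
    mean_int = ndi.mean(corrected, labels, idx)        # proxy for chromatin density
    total_int = ndi.sum(corrected, labels, idx)        # proxy for total chromatin content
    return n_nuclei, areas, mean_int, total_int

# Toy usage with a synthetic image containing two bright "nuclei":
img = np.zeros((64, 64)); img[10:20, 10:20] = 100; img[40:52, 30:42] = 150
print(quantify_nuclei(img, threshold=50, background=5))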
they do not interfere with subsequent rounds of imaging. exit sites in BSC1 cells. Merging the image later.
○ FIONA: The specifics can vary, but it generally involves a similar cycle of activation, localization, and inactivation of GLIM: Golgi Localization Method
fluorophores. The key difference lies in the potential for higher precision and possibly different strategies for 9. 3D Imaging • Nocodazole Treatment: Induces Golgi mini-stacks formation.
achieving this. • Involves capturing images at different depths (z-stacks) with a consistent z-step distance. • Localization Quotient (LQ): A measure used to calculate Gol
4. Data Analysis: • XY,XZ,YZ • Criteria for Selection:
○ PALM: Uses algorithms to determine the center of the PSF for each molecule, which is then used to reconstruct the ○ Signal-to-noise ratio ≥ 30.
super-resolved image. 10. Time Lapse Imaging ○ Axial angle criteria: d1 ≥ 70nm.
○ FIONA: May employ more advanced or different algorithms to achieve the claimed 1-nanometer accuracy. The • Captures images at regular time intervals, requires live cell imaging, and maintaining a
analysis techniques are critical to extracting the highest possible resolution from the data. ○ Co-linear criteria: /tanα/ or /tanβ/ ≤ 0.3.
constant time interval.
5. Applications:
High-Resolution Golgi Protein Mapping
○ Both techniques are used to study the nanoscale organization of cellular structures and molecular interactions.
However, the higher precision achievable with FIONA could potentially offer even more detailed insights into • Resolution: ~30 nm, allowing for densely packed Golgi protein
Time-lapse imaging is a photographic technique used to capture the appearance or
biological processes. • Kinetics: Quantitative analysis of RUSH system reporters and
behavior of subjects over a period of time, where a series of images are taken at regular
6. Technical Requirements: intervals. Here's how it generally works and how photobleaching can affect it:
○ PALM: Requires specific photoactivatable fluorophores and a setup capable of precise control over the activation Intra-Golgi Trafficking Model
How Time-Lapse Imaging Works:
and imaging light. • Golgi Complex: Linear structure with cargos entering at the ci
1. Intervalometer: A device or software feature that triggers the camera at set time
○ FIONA: May require more sophisticated imaging and analysis techniques to achieve the higher precision, including intervals to capture images automatically. • Secretory Cargos: Exit before reaching the TGN, with TGN ta
potentially more advanced microscopy hardware and software. 2. Consistent Conditions: The subject and camera settings (focus, exposure, white
balance, etc.) are kept consistent between shots to ensure continuity.
F-techniques in Microscopy 3. Capture: The camera takes a series of individual images over a set period, which
• FRAP (Fluorescence Recovery After Photobleaching): Used for live cell imaging to measure the kinetics of molecular can range from seconds to hours or even days.
diffusion and to determine the continuity of cellular structures like the ER (Endoplasmic Reticulum). 4. Stacking: The individual images are then stacked or compiled to create an
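FRAP kinetics are commonly summarized by fitting the post-bleach recovery curve; the sketch below fits a single-exponential model with SciPy. The model form, parameter names, and synthetic data are illustrative assumptions rather than the analysis specified in the notes:

import numpy as np
from scipy.optimize import curve_fit

def frap_model(t, mobile_fraction, k):
    """Normalized recovery: F(t) = mobile_fraction * (1 - exp(-k t))."""
    return mobile_fraction * (1.0 - np.exp(-k * t))

t = np.linspace(0, 60, 61)                                            # seconds after bleach
data = frap_model(t, 0.8, 0.1) + np.random.normal(0, 0.02, t.size)    # toy data

(mobile, k), _ = curve_fit(frap_model, t, data, p0=[0.5, 0.05])
print(f"mobile fraction ~ {mobile:.2f}, recovery half-time ~ {np.log(2)/k:.1f} s")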
animation or video, giving the illusion of time being compressed.
5. Software: Specialized software is often used to process the sequence of images
Principle: Removes out-of-focus light using point illumination and point collection. into a time-lapse video.
Scanning Mechanism: Point-by-point scanning, with pixel dwelling time in microseconds.
6. Live Cell Imaging: In scientific applications, such as cell biology, time-lapse
Features of the Confocal Microscope microscopy is used to observe cellular processes over time without disturbing the
• Epi-fluorescence illumination with a laser. cells.
• Pinhole for confocality, with a trade-off between light collection and confocal quality. Photobleaching and Its Effects on Time-Lapse Imaging:
• Photomultiplier tube (PMT) for detection, which amplifies signals but lacks spatial resolution. 1. Photobleaching: This is a process where the fluorescence from a specimen fades
Image Acquisition in a Confocal Microscope over time due to repeated exposure to light, particularly from the illumination
• Pixel dwelling time is controlled by scanning speed. source used in fluorescence microscopy.
• Resolution can be adjusted through zoom levels (e.g., 128x128, 256x256, 512x512). 2. Fluorescent Molecules: Photobleaching primarily affects samples that use
• Manipulation of pinhole size and laser power affects image quality. • fluorescent markers or dyes, as these molecules are more susceptible to light-
induced degradation.
PMT 3. Impact on Time-Lapse: In time-lapse imaging, especially in fluorescence
A Photomultiplier Tube (PMT) is a highly sensitive electronic device that is used to detect and microscopy, photobleaching can reduce the fluorescence signal over time, leading
amplify light signals, particularly in low-light conditions. It is commonly used in various scientific to a decrease in image quality and potentially affecting the scientific outcome of the
applications, including fluorescence microscopy, as well as in medical and industrial settings. study.
Here's how a PMT works and its key features: 4. Mitigation Strategies:
○ Minimize Light Exposure: Use the lowest light intensity necessary to obtain
Working Principle: a good image.
1. Photoelectric Effect: When light (photons) strikes the photocathode, which is a light - ○ Optimize Exposure Time: Shorter exposure times reduce the chance of
sensitive material, it can eject electrons through the photoelectric effect. photobleaching.
2. Amplification: The emitted electrons are accelerated towards a series of electrodes ○ Use Anti-Fade Reagents: These can help to protect the fluorescent
Mitotic ER is continuous *It discusses the default exit of secretory cargos before reaching t
called dynodes by an electric field. Each dynode is at a higher voltage than the previous molecules from degradation.
one. • FLIP (Fluorescence Loss In Photobleaching): Compared to FRAP, used to demonstrate the continuous connection of to the TGN being signal-dependent, which adds a layer of regulat
cellular structures. ○ Cooling the Specimen: Lower temperatures can slow down the rate of
3. Secondary Electron Emission : As the electrons strike each dynode, they cause the photobleaching. *The use of molecular markers (like VSVG, E -cadherin, TNFα, C
emission of multiple secondary electrons. This process is repeated at each dynode, ○ Use of Photo-Stable Dyes: Some fluorescent dyes are more resistant to protein trafficking within the Golgi could be a focus of the lecture
leading to a cascade effect that significantly amplifies the original signal. Dynode become photobleaching than others. different cargo molecules are sorted and transported.
more and more positively charged. 5. Non-Fluorescent Samples: If the time-lapse imaging does not involve
4. Anode Collection: The final stage collects the amplified electrons at the anode, which is Epilogue
fluorescence, photobleaching is not a concern.
gi protein localization. •
n visualization.
d intra-Golgi trafficking.
s-Golgi and exiting at the TGN. • Blur - primary effect of image blur is to reduce the contrast and visibility of small objects or detail.
argeting being signal-dependent. • Noise – random variation of brightness/ color information in images
• Artifact - can create image features that do not represent a body structure
• Staining: Enhancing contrast with heavy metals like osmium, which reacts with unsaturated fatty acids and • Distortion - inaccurate impression of their size, shape, and relative positions.
crosslinks membrane lipids.
• Sensitivity and Specificity: ability to show a condition as positive or negative.
• ROC Curve: relationship between sensitivity and specificity.
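Sensitivity and specificity, as defined above, reduce to two ratios of confusion-matrix counts. A minimal sketch with made-up counts:

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: ability to show a true condition as positive."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: ability to show a healthy case as negative."""
    return tn / (tn + fp)

print(sensitivity(tp=90, fn=10), specificity(tn=80, fp=20))   # 0.9, 0.8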
5. BOLD in fMRI:
• Primary form of fMRI contrast.
• Maps neural activity by imaging changes in blood flow related to brain cell energy use.
• Deoxyhemoglobin is paramagnetic, affecting the magnetic susceptibility of blood.
• Sensitive to venous changes
9. Advantages of BOLD:
• Non-invasive, high spatial and temporal resolution, enables observation of entire brain networks.
8. Geometrical Optics
Refraction: Law of refraction and the concept of the refractive index.
The law of refraction, also known as Snell's Law, and the concept of the refractive index are fundamental
to the understanding of how light behaves when it passes from one medium to another , such as from air
into water or glass. Here's a detailed explanation of both:
Law of Refraction (Snell's Law)
1. Description: The law of refraction states that when a light wave passes from one medium to
another, there is a change in its speed, resulting in a change in direction (refraction) unless the
light is incident perpendicularly (normal incidence) to the boundary between the two media.
2. Mathematical Expression: The law is mathematically expressed as n1·sin(θ1) = n2·sin(θ2), where:
○ n1 and n2 are the refractive indices of the first and second media, respectively.
○ θ1 is the angle of incidence (the angle between the incident ray and the normal to the surface).
○ θ2 is the angle of refraction (the angle between the refracted ray and the normal).
3. Consequences: When light passes from a medium with a lower refractive index to one with a
higher refractive index, it slows down and bends towards the normal. Conversely, when it passes Objective Specifications
from a medium with a higher refractive index to one with a lower refractive index, it speeds up • Additional Specs: Tube length, working distance (WD), cover glass thickness,
and bends away from the normal. and immersion medium.
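As a worked example of Snell's law, the snippet below computes the refraction angle for light entering water from air, using the approximate refractive indices listed later in these notes; the function itself is a generic calculation, not lecture code:

import math

def refraction_angle(n1: float, n2: float, theta1_deg: float) -> float:
    """Solve n1*sin(theta1) = n2*sin(theta2) for theta2 (degrees)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("Total internal reflection: no refracted ray.")
    return math.degrees(math.asin(s))

# Air (n ~ 1.00) into water (n ~ 1.33), 30-degree incidence:
print(f"{refraction_angle(1.00, 1.33, 30.0):.1f} degrees")   # ~22.1, bent towards the normal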
Refractive Index • Usage Caution: Never apply oil to a dry lens.
1. Definition: The refractive index (denoted as n) of a medium is a measure of how much the speed
of light is reduced inside the medium as compared to the speed of light in a vacuum (denoted as Numerical Aperture of Condenser
c). • Role: Controls the illumination NA, which affects image brightness and resolution.
2. Relationship with Speed of Light: The refractive index is defined by the ratio of the speed of light • Adjustment: Through the condenser aperture.
in a vacuum to the speed of light in the medium: n = c/v, where v is the speed of light in the medium.
3. Physical Meaning: A higher refractive index indicates that light travels more slowly in the medium Bright-Field Microscopy
and is more "refracted" or bent when entering it. • Process: Uses bright light to illuminate samples, primarily relying on light
4. Values: The refractive index is a dimensionless number. For example: absorption for visualization.
○ Air: approximately 1.00 • Applications: Common in histological sample staining.
○ Water: approximately 1.33 • Sample Preparation: Involves fixation, embedding and sectioning, and staining.
○ Glycerol: approximately 1.47 1. Fixation:
○ Immersion Oil: approximately 1.52 ○ Purpose: To immobilize, kill, and preserve the cells.
○ Glass: approximately 1.52 ○ Method: Fixatives such as formaldehyde are commonly used to
5. Dispersion: The refractive index often varies with the wavelength of light, leading to dispersion, prevent decay and maintain the structure of the cells.
where different colors (wavelengths) of light are refracted by different amounts, causing them to 2. Embedding and Sectioning:
spread out. ○ Dehydration: After fixation, the tissue is dehydrated to remove water,
which prepares it for embedding.
6. Applications: The concept of the refractive index is crucial in various fields, including optics, where ○ Embedding: The dehydrated tissue is embedded in a supporting
lenses are designed to focus light by bending the rays according to their refractive indices. medium, typically hot liquid wax or resin.
In summary, the law of refraction describes how light changes direction when passing from one medium ○ Solidification: The embedding medium solidifies, providing mechanical
to another, and the refractive index is a measure of how much the speed of light is reduced in a strength to the tissue.
medium, which directly affects the degree of refraction. Together, these concepts explain the behavior ○ Sectioning: The embedded tissue is cut into thin sections (5 - 15 µm 1. Signal:
of light in different optical environments and are essential for designing optical instruments and thick) using a microtome, a precision instrument designed for slicing ▪ In microscopy, the signal is the light emitted or scattered by the specimen that is being imaged,
biological specimens. particularly when it has been labeled with a fluorescent dye or is naturally fluorescent. For instance, in
understanding vision and various physical phenomena. fluorescence microscopy, the signal is the specific light emitted by the fluorophores that are bound to
3. Staining:
○ Purpose: To reveal cellular and subcellular structures by contrasting the target molecules within the sample.
different components within the cells. 2. Noise:
○ Rehydration: The sections are rehydrated to prepare them for staining. ▪ Noise is the random variation of the image intensity that is not related to the signal. It can arise from
○ Staining Techniques: Various staining methods are used, including: various sources, such as electronic noise from the detector, stray light within the microscope,
▪ Hematoxylin and eosin (H&E) staining, which is the most autofluorescence from the sample itself, or even thermal fluctuations.
common technique in histology. Hematoxylin tends to stain cell
nuclei blue-black, while eosin stains the cytoplasm and cell A high signal-to-noise ratio means that the signal is much stronger than the noise, making it easier to detect and
membranes pink. analyze the features of interest within the sample. This is particularly important in fluorescence microscopy, where
○ Mechanism: The molecular mechanism of staining is not entirely clear the signal (fluorescence) can be quite weak compared to the background noise.
but is thought to involve an affinity of certain dyes for specific cellular
components, such as negatively charged molecules for hematoxylin. 7. Wide-field Epi-Fluorescence Microscopy
4. Mounting: • Explains the difference between episcopic and diascopic illumination.
○ The stained sections are mounted on a glass slide using an
appropriate mounting medium to secure them in place.
○ A cover slip is placed over the mounted section to protect it and
reduce the risk of damage or contamination.
5. Additional Considerations:
○ Cover Glass Thickness: The thickness of the cover glass (usually
0.17 mm) must be consistent to ensure even contact with the mounting
medium and to facilitate focusing.
○ Immersion Medium: If high magnification objectives are used, an
immersion medium such as oil may be required to improve image
quality by reducing spherical aberration. ○
Lens: Ideal lens, optical axis, focal points, and focal lengths. 6. Microscope Slide Preparation:
Real vs. Virtual Images: Differences, detection methods, and applications. ○ The slides are labeled with relevant information, such as the type of
Real and virtual images are two types of images formed by optical systems, such as lenses and mirrors. tissue, the date of preparation, and any special staining techniques
They have distinct properties and are used in various applications due to their unique advantages. used.
Real Images 7. Storage:
○ Prepared slides are stored in a controlled environment to prevent
Definition:
damage and maintain the quality of the specimen.
• A real image is formed when light rays actually converge to form an image. This happens when 8. Examination:
light rays emitted from an object are refracted by a lens or reflected by a mirror in such a way that ○ Once prepared, the slides are ready to be examined under the
they meet at a point. microscope, where the bright-field technique relies on the absorption
Characteristics: of certain wavelengths of light to visualize the structure of the stained Episcopic Illumination:
• Real images are inverted relative to the object. sample. ○ In episcopic illumination, also known as "epi-illumination," the light source is placed above the
• They can be projected onto a screen because the light rays physically intersect. objective lens, and the light is directed onto the specimen through the objective itself. This means that
Advantages: Phase Contrast Microscopy(A variation of bright field the objective lens acts as a condenser, focusing the light onto the specimen.
• Can be displayed on a screen for viewing by multiple people. ○ Episcopic illumination is particularly used in fluorescence microscopy, where the same objective that
microscopy) is used to focus the light for excitation of the fluorophores is also used to collect the emitted
• Can be captured permanently on photographic film or a digital sensor.
• Advantages: No need for cell staining, allows observation of live cells. fluorescence.
• Can be manipulated using additional optical components to alter size, orientation, or quality.
• Requirements: Special objectives and phase rings. The objective phase ring ○ The advantage of this method is that it allows for high-intensity illumination of the specimen, which is
Applications: must match the condenser phase ring. necessary for the detection of weakly fluorescent signals.
• Projectors use real images to display content on a screen. • Principle: Utilizes differences in refractive indices of cellular organelles to create ○ Episcopic illumination also helps to minimize the background light, as the objective can be designed to
• Cameras produce real images on film or a sensor to capture photographs. phase shifts, translating into intensity differences. transmit the fluorescence emission while blocking the excitation light.
• Telescopes and microscopes use real images for detailed observation and analysis. Diascopic Illumination:
○ Diascopic illumination, also known as "transmitted light" or "normal" illumination, involves a light
source that is placed below the specimen. The light passes through the specimen and is then
collected by the objective lens.
○ This is the traditional method of illumination used in bright-field microscopy, where the specimen is
visualized based on the absorption or scattering of light as it passes through the sample.
○ The advantage of diascopic illumination is that it provides a general view of the entire specimen and is
useful for observing the overall morphology and structure.
○ However, diascopic illumination is not ideal for fluorescence microscopy because the emitted
fluorescence signal is weaker and can be overwhelmed by the strong excitation light if not properly
separated.
• Discusses the role of filters in reducing background noise.
2. Filter Turret:
9. Compound Microscope Systems
○ The filter turret is a rotating wheel or slide mechanism that holds multiple filter sets. Each filter
Finite vs. Infinity Systems: Differences, advantages, and applications. set typically consists of an excitation filter, a dichroic mirror, and an emission filter.
In the context of optical microscopy, finite and infinite systems refer to two different types of optical ○ By rotating the turret, different filter sets can be brought into the optical path to select the
configurations used in microscopes, particularly concerning the way light is managed between the appropriate wavelengths for exciting and detecting different fluorophores.
objective lens and the eyepiece or tube lens. ○ The turret allows for quick and easy switching between different fluorescence channels without
the need for manual filter changes, which can be time-consuming and may risk contamination.
3. Filter Cube:
○ The filter cube is a complete assembly that includes the excitation filter, the dichroic mirror, and
the emission filter, all housed together in a single unit.
○ The filter cube is designed to match the specific excitation and emission characteristics of a
particular fluorophore. It ensures that only the light necessary to excite the fluorophore passes
through the excitation filter, the emitted light is transmitted by the dichroic mirror, and the
unwanted light is blocked by the emission filter.
○ Using a filter cube with the correct specifications is crucial for obtaining a high-quality
fluorescence signal and minimizing background noise.
• High numerical aperture objective and alignment ease.
• Considerations for sample mounting and illumination control.
Finite System
1. Description:
○ A finite optical system is one in which the objective lens forms a real image at a specific
distance (finite distance) from the lens itself.
2. Components:
Gamma, also known as the contrast or tone curve, affects the display of an image by
defining how the intensity values in the image are mapped to the display output. It is a
measure of the overall "contrastiness" or the way in which the mid-tones in an image are
rendered. Here's how gamma impacts image display:
1. Linear vs. Non-Linear Response:
○ A gamma value of 1.0 represents a linear response, meaning that there is a
one-to-one correspondence between the input values and the output. This
results in a display that has the same contrast as the original image data.
○ Values greater than 1.0 produce a more contrasted or "crisp" image, with
darker darks and lighter lights, often referred to as being in the "gamma
correction" mode.
○ Values less than 1.0 result in a less contrasted or "flat" image, where the
differences between light and dark areas are less pronounced.
2. Human Perception of Brightness:
○ The human eye perceives light in a non-linear fashion, which is why many
display devices are calibrated with a gamma value of around 2.2. This
compensates for the eye's response and makes the image appear more
natural.
3. Display Devices:
○ Different display devices may have different native gamma values. For
example, older CRT monitors typically had a gamma of around 2.5, while
modern LCD monitors are often calibrated to a gamma of 2.2.
4. Image Processing:
○ Adjusting gamma can be used as a simple form of image processing to
improve the appearance of an image on a particular display device without
altering the actual pixel values in the image.
5. Digital Image Representation:
○ In digital image files, gamma correction is often applied during the encoding
process, and this information is stored in the file's metadata. When the image
is displayed or processed, the gamma value is taken into account to ensure
the image appears as intended.
6. Professional Settings:
○ In professional imaging and printing, accurate gamma correction is crucial for
ensuring that the image on the display matches the final printed result.
7. Software Tools:
○ Image editing software often includes gamma adjustment tools, allowing
users to modify the gamma curve for creative or technical purposes.
8. Night Vision and Low Light Conditions:
○ In some cases, such as night vision or low-light photography, a gamma
adjustment can help to bring out details in the darker parts of the image.
9. Compatibility Issues:
○ Incorrect gamma settings can lead to compatibility issues where an image
looks correct on one device but appears too dark or too light on another.
In summary, gamma is a critical factor in how an image is displayed. It influences the
perceived brightness and contrast of the image and must be properly managed to ensure
that images are displayed accurately across different devices and viewing conditions.
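The gamma mapping described above amounts to a power-law transform of normalized intensities. A minimal sketch for an 8-bit image; note that conventions differ on whether the exponent is written as gamma or 1/gamma, and this example uses the common correction form out = in^(1/gamma):

import numpy as np

def apply_gamma(image_8bit: np.ndarray, gamma: float) -> np.ndarray:
    """Map normalized intensities through out = in**(1/gamma); gamma = 1.0
    is the identity (linear) mapping."""
    normalized = image_8bit.astype(float) / 255.0
    corrected = np.power(normalized, 1.0 / gamma)
    return (corrected * 255.0).round().astype(np.uint8)

ramp = np.arange(256, dtype=np.uint8)            # linear gray ramp
print(apply_gamma(ramp, 2.2)[:5], apply_gamma(ramp, 1.0)[:5])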
18. Conclusion
• Emphasizes the importance of proper image acquisition, manipulation, and storage to
maintain scientific integrity and data quality.
Horseradish Peroxidase(HRP)
Horseradish Peroxidase (HRP) is an enzyme that is widely used in molecular biology and histology for a variety
of purposes due to its enzymatic properties. Here are some of the main uses and purposes of HRP:
1. Detection and Assays: HRP is commonly used as a label in enzyme-linked immunosorbent assays
(ELISAs) and Western blotting. It is attached to secondary antibodies, and upon binding to the primary
antibody, it can catalyze a colorimetric or chemiluminescent reaction to produce a detectable signal.
2. Histochemistry and Cytochemistry: HRP is used for the detection of specific antigens in tissues and 13. History of PET
cells. When HRP is linked to an antibody, it can be used to localize the antibody's target within a cell or • Inception: Emission and transmission tomography by Kuhl and Edwards in the 1950s.
tissue. • Development: Ido's synthesis of 18-F FDG; Robertson and Cho's ring system prototype.
3. Electron Microscopy (EM): HRP can be used as an electron-dense tracer in transmission electron • Projection X-ray (Radiography)
microscopy (TEM). After being linked to an antibody, HRP can be visualized at high resolution due to the • X-ray Computed Tomography (CT) 14. PET Principles
formation of an electron-dense reaction product with substrates like diaminobenzidine (DAB).
• Medical Ultrasonography (Ultrasound) • Positron Emission: Proton to neutron transformation, emission of a positron.
4. Localization of Proteins: In immuno-electron microscopy, HRP is used to localize proteins within cells or • Annihilation: Positrons annihilate with electrons, producing two 511 keV gamma photons.
• Nuclear Imaging (SPECT, PET)
tissues. The enzyme can be reacted with DAB and hydrogen peroxide (H2O2) to produce an insoluble • Positive beta decay emits a positron, the antiparticle of the electron, with opposite charge.
precipitate that is visible in EM. • Magnetic Resonance Imaging (MRI)
5. Signal Amplification: HRP can be used to amplify signals in various assays due to its catalytic activity, • Functional Imaging (fMRI, MEG, EEG, PET)
which allows for a single enzyme molecule to convert multiple substrate molecules. 6. X-Rays
6. In Situ Hybridization (ISH): HRP can be used as a label for the detection of specific nucleic acid • Detected as particles of energy (photons).
sequences in tissue sections or cells, providing a way to visualize gene expression patterns. • Discovered by Wilhelm Conrad Roentgen in 1895.
7. Affinity Labeling: HRP can be used in affinity labeling techniques where it is covalently attached to a 7. X-ray Generator System
molecule of interest, allowing for the specific labeling and detection of target molecules. • Cathode and anode within an evacuated glass tube.
8. Neuroanatomy: HRP has been used as a tract-tracing tool in neuroanatomy to map neural pathways by • High voltage generates X-rays.
being taken up by neurons and transported along their axons.
9. Reporter Enzyme: In molecular biology, HRP can be used as a reporter enzyme in various assays to
indicate the presence or activity of other molecules of interest.
10. Bioremediation: HRP is also being studied for its potential use in bioremediation, where it can catalyze the
breakdown of environmental pollutants.
The versatility of HRP stems from its stability, ease of conjugation to other molecules, and its ability to catalyze
reactions that produce easily detectable products.
MiniSOG
MiniSOG is a genetically encoded tag derived from the phototropin protein found in plants, which has been
engineered for use in fluorescence microscopy and correlated light and electron microscopy (CLEM). Here are
the key characteristics and uses of MiniSOG:
1. Size and Origin: MiniSOG is a smaller version of the Singlet Oxygen Generator (SOG) protein, hence the
name "mini." It is derived from Arabidopsis thaliana phototropin, a plant blue -light receptor.
2. Fluorescence: MiniSOG is fluorescent, allowing it to be used for live cell imaging and fluorescence
microscopy before being used for EM.
3. Singlet Oxygen Generation: When illuminated with blue light, MiniSOG catalyzes the conversion of
molecular oxygen to singlet oxygen, a highly reactive form of oxygen.
4. Photooxidation: The singlet oxygen generated by MiniSOG can catalyze the polymerization of electron-dense precursors like diaminobenzidine (DAB) into a product that is osmiophilic (attracts osmium), which enhances contrast in electron microscopy.
5. Correlated Light and Electron Microscopy (CLEM): MiniSOG is particularly useful for CLEM, where it allows for the correlation of light microscopic images with high-resolution electron microscopic images. It bridges the resolution gap between light and electron microscopy.
6. Protein Tagging: MiniSOG can be genetically fused to proteins of interest, enabling the localization of these proteins within cells or tissues at the electron microscopic level.
7. Ultrastructural Preservation: The photooxidation reaction catalyzed by MiniSOG helps to preserve cellular ultrastructure, which is crucial for detailed cellular and molecular studies.
8. 3D Localization: When used in conjunction with electron tomography, MiniSOG can provide three-dimensional information about the localization of the tagged protein within the cell.
9. Multiplexing: MiniSOG can be used in combination with other tags or fluorophores to visualize multiple proteins or structures simultaneously.
10. Research Applications: MiniSOG has been used to study various cellular processes, including protein trafficking, organelle structure, and cellular signaling pathways.
The development of MiniSOG and its application in CLEM has significantly advanced the field of cellular microscopy by providing a tool to localize proteins with high precision in both light and electron microscopy.
APEX/APEX2
APEX (Ascorbate Peroxidase) and its improved version APEX2 are engineered enzymes derived from plant ascorbate peroxidase. These enzymes have been adapted for use as tags in biological research, particularly for protein localization and imaging in electron microscopy (EM). Here are the key features and applications of APEX and APEX2:
1. Enzymatic Activity: Both APEX and APEX2 are peroxidases, which means they can catalyze the oxidation of various substrates in the presence of hydrogen peroxide (H2O2).
2. Genetic Fusion: APEX and APEX2 can be genetically fused to proteins of interest, allowing for the localization of these proteins within cells or tissues.
3. Electron Microscopy: The primary use of APEX and APEX2 is in EM, where they are used to generate electron-dense labels that enhance contrast and allow for high-resolution imaging of the tagged proteins.
4. Activity in Cytosol: Unlike some other peroxidases, APEX2 is active in the cytosol, which makes it a versatile tool for labeling a wide range of cellular structures.
5. Labeling Method: APEX and APEX2 catalyze the conversion of electron-dense precursors, such as diaminobenzidine (DAB), into an insoluble, brownish precipitate that is visible in EM. This reaction produces a label that marks the location of the APEX/APEX2 fusion protein.
6. Correlated Light and Electron Microscopy (CLEM): APEX2 can be used in CLEM experiments, where it provides a bridge between light microscopy and EM, allowing for the correlation of protein locations observed in light microscopy with ultrastructural details resolved by EM.
7. 3D Localization: When used in conjunction with electron tomography, APEX2 can provide three-dimensional information about the localization of the tagged protein within the cell.
8. High Specificity and Sensitivity: APEX2 offers high specificity and sensitivity for protein localization, allowing for the detection of even low-abundance proteins.
9. Research Applications: APEX and APEX2 have been used to study various cellular processes, including protein trafficking, organelle structure, and cellular signaling pathways.
10. APEX2 Improvements: APEX2 was engineered to have improved properties over the original APEX, including higher stability, increased activity, and better solubility, making it a more reliable and user-friendly tool for cellular imaging.
The development of APEX and APEX2 has significantly advanced the field of cellular microscopy by providing a tool to localize proteins with high precision in both light and electron microscopy, thereby facilitating a deeper understanding of cellular structures and functions.
Cryo Fixation and Cryo-Electron Tomography (cryoET): (discussed with the negative-stain and cryo-EM material below)
15. Radionuclides and Radiotracers in PET
• Common Isotopes: Carbon-11, Nitrogen-13, Oxygen-15, Fluorine-18.
• FDG (Fluorodeoxyglucose): Most used in clinical PET for oncology and neurology.
• Metabolic trapping of FDG
Fluorodeoxyglucose (FDG) is a radiotracer that is widely used in positron emission tomography (PET) imaging, particularly for oncology. It is an analogue of glucose, the primary source of energy for most cells in the body. The metabolic trapping of FDG is a process that allows PET scans to visualize areas of increased glucose metabolism, which is a common characteristic of many cancer cells.
Here's how metabolic trapping of FDG works:
1. Administration: FDG, labeled with the positron-emitting isotope fluorine-18, is injected into the patient's bloodstream.
2. Transport: Like glucose, FDG is transported into cells via glucose transport proteins, which are often overexpressed in rapidly growing cancer cells.
3. Phosphorylation: Once inside the cell, FDG is phosphorylated by the enzyme hexokinase to form FDG-6-phosphate. This is a key step in the metabolic pathway of glucose and occurs rapidly in cells with high metabolic activity.
4. Trapping: Unlike glucose-6-phosphate, FDG-6-phosphate is not a substrate for further metabolism. It cannot be converted into glucose-1-phosphate or other metabolites that can enter the glycolysis pathway or the glycogen synthesis pathway. As a result, FDG-6-phosphate becomes trapped within the cell.
5. Accumulation: Because FDG-6-phosphate is not metabolized further, it accumulates in cells that have taken up FDG, with higher accumulation in cells with higher metabolic rates.
6. Detection: The radioactive decay of fluorine-18 in FDG-6-phosphate results in the emission of positrons, which, upon annihilation with electrons, produce gamma photons that are detected by the PET scanner.
7. Imaging: The PET scanner uses the detected gamma photons to create an image that reflects the distribution of FDG in the body. Areas with high FDG uptake, indicating high metabolic activity, appear as "hot spots" on the PET image.
8. Application: This process is particularly useful for detecting tumors, as cancer cells often have increased glucose metabolism compared to surrounding normal tissues. FDG-PET is also used to stage cancers, evaluate treatment response, and monitor for recurrence.
The metabolic trapping of FDG is a critical aspect of its utility in PET imaging. It allows for the visualization of glucose metabolism, which can highlight areas of abnormal activity that may be indicative of disease, especially cancer. However, it's important to note that not all increased FDG uptake is due to cancer; other conditions with increased metabolic activity, such as inflammation, can also result in increased FDG uptake and must be considered in the interpretation of PET scans.
16. PET Probes for Biological Imaging
• Applications: Hemodynamic parameters, substrate metabolism, protein synthesis, enzyme activity, drug action, receptor affinity, neurotransmitter biochemistry, gene expression.
• Chelating agent: An organic compound that binds to charged ions to increase absorption.
○ EDTA, ethylenediamine, phosphite, DOTA, DTPA
17. Benefits and Limitations of PET Scan
• Benefits: Unique information, cost-effective over surgery, early disease detection, low radiation exposure.
• Limitations: Time-consuming, lower resolution, high equipment costs, false results with chemical imbalances, timing sensitivity.
18. PET Applications
• Cancer: FDG-PET for diagnosis, staging, and treatment monitoring.
• Neuroimaging: Brain activity in dementia, cognitive neuroscience, psychiatry.
• Cardiology: Detection of "hibernating myocardium" and atherosclerosis.
• Infectious Diseases: Imaging bacterial infections using FDG.
19. PET/CT and PET/MRI Scans
• PET/CT: Combines PET's functional data with CT's anatomic detail.
• PET/MRI: Hybrid technology merging MRI's soft tissue imaging with PET's metabolic imaging.
20. PET/CT and PET/MRI Advantages and Challenges
• Advantages: Better localization, increased diagnostic accuracy, decreased scan time.
• Challenges: High costs, difficulty in producing and transporting short-lived radiopharmaceuticals, longer acquisition times for PET/MRI.
An X-ray generator is a critical component of an X-ray imaging system. Its primary function is to produce X-rays, which are a form of high-energy electromagnetic radiation that can penetrate various materials, including the human body. Here's a detailed explanation of the function and components of an X-ray generator:
Components of an X-ray Generator
1. X-ray Tube: The tube is where the actual production of X-rays takes place. It typically consists of a vacuum-sealed glass or metal envelope containing a cathode (electron source) and an anode (target for the electrons).
2. Cathode: The cathode, often in the form of a heated filament, emits electrons when heated. These electrons are then accelerated towards the anode.
3. Anode: The anode is usually made of a high atomic number material, such as tungsten, which can withstand the high energy impact of the electrons. The anode is designed to rotate or oscillate to dissipate heat, as the process of X-ray generation generates a significant amount of it.
4. High Voltage Supply: This component provides the necessary high voltage (tens of thousands to millions of volts) to accelerate the electrons from the cathode towards the anode.
5. Control Unit: The control unit manages the operation of the X-ray generator, regulating the voltage, current, and exposure time to produce the desired X-ray output.
6. Cooling System: Since the generation of X-rays is a heat-intensive process, a cooling system is often integrated into the design to prevent overheating and maintain the stability and longevity of the X-ray tube.
Functioning of an X-ray Generator
1. Electron Emission: When the X-ray generator is activated, the cathode heats up and emits a stream of electrons.
2. Acceleration: The high voltage supply accelerates these electrons to a high speed towards the anode.
3. Impact and X-ray Production: As the high-speed electrons hit the anode, the sudden deceleration causes the conversion of some of the kinetic energy of the electrons into X-ray photons, a process known as bremsstrahlung (braking radiation). Additionally, some X-rays can be produced through characteristic X-ray emission when an electron fills a vacancy in an inner shell of an atom in the anode material.
4. X-ray Beam: The produced X-ray beam exits the X-ray tube and is directed towards the patient. The intensity and energy of the X-ray beam can be adjusted by varying the voltage and current supplied to the X-ray tube.
5. Imaging: The X-ray beam passes through the patient's body. Dense materials, such as bones, absorb more X-rays and allow fewer to pass through, while less dense materials, such as soft tissues and air, absorb less and allow more X-rays to pass through. This differential absorption creates an image that can be captured on a detector or film, providing a visual representation of the internal structures of the body.
6. Safety: The control unit ensures that the X-ray exposure is limited to the necessary diagnostic range, minimizing the patient's exposure to ionizing radiation.
In summary, the X-ray generator plays a crucial role in medical imaging by producing X-rays that can penetrate the body and create images of internal structures. The generator's components and control systems allow for precise control over the X-ray output to achieve the best possible diagnostic images while minimizing radiation exposure.
8. Radiography Process
• Conventional vs. Digital Radiography.
Conventional radiography and digital radiography are two methods used to create images of the internal structures of the body, primarily using X-rays. They differ in the way the images are captured, processed, and viewed. Here's a comparison of the two:
Conventional Radiography
1. Imaging Process: Conventional radiography uses X-ray film that is coated with a light-sensitive silver bromide emulsion. When the film is exposed to X-rays, the silver bromide grains are reduced to metallic silver, creating a latent image.
AgBr (latent image) + developer → Ag (metallic silver) + Br− + developer oxidation products
2. Development: After exposure, the film is developed in a darkroom using chemicals that react with the exposed silver bromide to produce a visible image. The unexposed silver bromide is removed during the development process.
3. Viewing: The developed film is then viewed as a physical print, often on a lightbox for better visualization of the image.
4. Storage: The film images must be stored physically, which requires space and can be subject to degradation over time.
5. Quality: The quality of the image can be affected by the development process and the quality of the film itself.
6. Dynamic Range: Conventional radiography has a limited dynamic range, which can make it challenging to visualize structures with varying densities within the same image.
7. Archiving and Retrieval: Retrieval of images can be time-consuming, as physical files must be located and retrieved.
8. Cost: Initially, the cost of conventional radiography may be lower, but the ongoing costs of film, chemicals, and storage can add up over time.
Digital Radiography (DR)
1. Imaging Process: Digital radiography uses a detector, such as a photostimulable phosphor
plate (PSP) or a solid-state detector, to capture the X-ray image. The detector converts the X-
ray energy into digital data.
2. Data Capture: The detector is exposed to the X-ray beam, and the resulting latent image is
stored as digital data.
3. Processing: The digital data is then processed by a computer, which can manipulate the
image to enhance contrast and clarity.
4. Viewing: The digital image can be viewed on a computer monitor or other digital display
devices immediately after processing.
5. Storage: Digital images can be stored electronically, saving space and allowing for easier
retrieval and transfer.
6. Quality: Digital radiography offers high-quality images with the ability to adjust the contrast
and brightness after the image has been captured. It also has a wider dynamic range,
allowing for better visualization of different tissue densities.
7. Post-Processing: Digital images can be post-processed to enhance certain features, such as
zooming in on specific areas or adjusting the contrast to better visualize certain structures.
8. Archiving and Retrieval: Digital images can be easily archived and retrieved electronically,
making them more accessible and reducing the risk of loss or damage.
9. Cost: While the initial investment in digital radiography equipment may be higher, the long-
term costs can be lower due to reduced consumables and the ability to manipulate images
without the need for retakes.
10. Teleradiology: Digital radiography facilitates teleradiology, where images can be sent
electronically to radiologists for interpretation, regardless of geographic location.
In summary, digital radiography offers several advantages over conventional radiography, including
immediate image availability, better image quality and manipulation capabilities, space savings from
electronic storage, and the potential for lower long-term costs. However, it requires a greater initial
investment in equipment and relies on having the necessary digital infrastructure in place.
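Because the detector output is just an array of numbers, the brightness/contrast adjustments mentioned above can be applied after acquisition rather than by re-exposing the patient. A minimal illustrative sketch in Python (the simulated counts and percentile limits are arbitrary choices, not a clinical protocol):

import numpy as np

# Simulated raw detector readout; in practice this comes from the PSP plate or flat-panel detector.
rng = np.random.default_rng(0)
raw = rng.normal(loc=3000, scale=400, size=(256, 256)).clip(0, 65535)

def rescale_for_display(img, low_pct=5, high_pct=99):
    """Map a chosen percentile range of raw counts onto an 8-bit display range."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = np.clip((img - lo) / (hi - lo), 0, 1)
    return (out * 255).astype(np.uint8)

display = rescale_for_display(raw)                         # default contrast
display_high_contrast = rescale_for_display(raw, 40, 60)   # narrower input range = higher contrast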
1. Blocked Designs:
○ Involve performing a task for a block of time followed by a rest period or a block of a different task.
○ Useful for examining state changes and detecting brain activations associated with a particular state.
2. Event-Related Designs:
○ Tasks or stimuli are presented as discrete events, and the BOLD response to each event is analyzed.
○ Powerful for estimating the time course of brain activity and determining baseline activity.
○ Allows for post hoc trial sorting and is well-suited for tasks with variable timing.
3. Mixed (Block-Event) Designs:
○ Combine elements of both blocked and event-related designs.
○ Can be used to analyze both state-dependent and item-related effects.
○ More complex to analyze but provide a comprehensive picture of brain activity.
4. Randomized Designs:
○ Events or trials are presented in a random order to reduce the predictability of the task and minimize
confounds related to temporal autocorrelation.
5. Cognitive Paradigm Designs:
○ Tailored to specific cognitive processes, such as memory, attention, or language, and may incorporate a variety of tasks or stimuli to investigate these processes.
6. Resting-State Designs:
○ Participants are instructed to rest while their brain activity is monitored; used to study the functional connectivity between brain regions in the absence of an explicit task.
MRI Image Formation
• Involves converting frequency domain data (k-space) to an image using an Inverse Fourier Transform (IFT).
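As an illustration of that last point, the k-space-to-image step is a 2D inverse FFT. A toy numpy sketch (k-space is simulated here by forward-transforming a known image, which is not how a scanner acquires it):

import numpy as np

# Toy "image": a bright rectangle on a dark background.
image = np.zeros((128, 128))
image[48:80, 40:88] = 1.0

# MRI acquires data in the frequency domain (k-space); simulate it with a forward FFT.
kspace = np.fft.fftshift(np.fft.fft2(image))

# Image reconstruction: inverse Fourier transform of k-space, magnitude taken for display.
reconstruction = np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))

assert np.allclose(reconstruction, image, atol=1e-9)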
23. EEG-fMRI:
• A multimodal technique recording EEG and fMRI data synchronously to study electrical brain activity and its
correlation with haemodynamic changes.
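When relating event timing to the haemodynamic signal (in fMRI or EEG-fMRI analyses), the stimulus train is typically convolved with a canonical haemodynamic response function. A minimal sketch, assuming the common double-gamma HRF shape (the parameters below are illustrative defaults, not values from the lecture):

import numpy as np
from scipy.stats import gamma

tr = 1.0                         # sampling interval, seconds
t = np.arange(0, 30, tr)         # 30 s of HRF support

# Double-gamma HRF: an early positive peak minus a smaller, later undershoot.
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
hrf /= hrf.sum()

# Stimulus train: brief events at samples 10, 40, and 70 of a 100-sample run.
stimulus = np.zeros(100)
stimulus[[10, 40, 70]] = 1.0

# Predicted BOLD time course = stimulus train convolved with the HRF (truncated to run length).
predicted_bold = np.convolve(stimulus, hrf)[:len(stimulus)]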
Hemodynamic activity refers to the various physiological processes related to the circulation of blood and
the delivery of oxygen and nutrients to tissues, as well as the removal of waste products like carbon dioxide.
In the context of the brain and functional MRI (fMRI), hemodynamic activity is particularly important because
it underlies the BOLD (blood-oxygen-level dependent) signal that fMRI measures. Here's a breakdown of the
key components of hemodynamic activity in the brain:
1. Blood Flow: The delivery of blood to the brain is regulated by the cardiovascular system. Blood flow to
a particular brain region can increase or decrease based on the metabolic demands of that region.
2. Oxygenation: Oxygen is carried to the brain by hemoglobin in red blood cells. The oxygenated blood
delivers oxygen for the metabolic needs of the brain tissue.
3. Neurovascular Coupling: This is the relationship between neural activity (electrical activity of neurons) and the hemodynamic response (changes in blood flow and oxygenation). When neurons in a brain region are active, they require more oxygen and nutrients, leading to an increase in blood flow to that area.
4. Hemoglobin States: Hemoglobin can exist in two main states:
○ Oxyhemoglobin (HbO2): When bound to oxygen, it is diamagnetic, meaning it doesn't significantly distort magnetic fields.
○ Deoxyhemoglobin (dHb): Without oxygen, it becomes paramagnetic, which means it can distort local magnetic fields and is the primary source of the BOLD contrast in fMRI.
5. Cerebral Metabolism: The brain's metabolic rate of oxygen (CMRO2) refers to how much oxygen is consumed per unit of time for metabolic processes in the brain.
6. Venous Drainage: After oxygen is delivered and utilized, deoxygenated blood is collected in veins and eventually drained from the brain.
7. Autoregulation: The brain has mechanisms to maintain a relatively constant blood flow, despite changes in blood pressure, through autoregulation.
In fMRI, the BOLD signal is sensitive to changes in the concentration of deoxyhemoglobin. An increase in neural activity in a brain region leads to a local increase in blood flow and oxygen delivery, which exceeds the immediate metabolic demand. This results in a relative increase in oxyhemoglobin and a decrease in deoxyhemoglobin, causing a more uniform magnetic field and an increase in the MRI signal intensity. This change in the MRI signal is used to infer brain activity.
Understanding hemodynamic activity is crucial for interpreting fMRI data because the BOLD signal is an indirect measure of neural activity, primarily reflecting the vascular response to changes in brain metabolism.
24. Types of Imaging:
• CT, US, MRI, Nuclear, and Optical imaging each focus on different aspects of body function and structure, such as anatomy, physiology, metabolism, and molecular processes.
MRI Contrast Agents
• Used for more specific imaging, with gadolinium-based chelates being common intravenous agents.
MRI contrast agents are substances that are introduced into the body to enhance the visibility of internal structures in MRI scans. They work by altering the local magnetic environment, which changes the relaxation times (T1, T2, and T2*) of nearby water molecules, resulting in differences in signal intensity on the MRI images. Here are the main functions and benefits of using MRI contrast agents:
1. Improved Visualization: Contrast agents increase the contrast between different types of tissues, making it easier to distinguish between normal and abnormal tissues.
2. Detection of Pathology: They can highlight areas of the body that may not be visible or clearly defined without enhancement, such as tumors, areas of inflammation, or blood vessels with blockages.
3. Characterization of Lesions: By changing the appearance of lesions on MRI scans, contrast agents can help determine if a lesion is benign or malignant, or differentiate between recurrent tumor and scar tissue post-surgery.
4. Vascular Imaging: Contrast agents are used in MR angiography (MRA) to provide detailed images of blood vessels, which can be used to diagnose aneurysms, atherosclerosis, or vascular malformations.
5. Functional MRI (fMRI): Although not typically referred to as "contrast agents," changes in blood oxygenation levels are used to highlight areas of the brain that are active during specific tasks or stimuli in functional MRI.
6. Perfusion Imaging: MRI contrast agents can be used to measure blood flow and blood volume in tissues, which is useful in the evaluation of brain disorders, tumor grading, and assessment of tissue viability.
7. Molecular Imaging: Targeted MRI contrast agents are being developed to bind to specific biological markers, potentially allowing for the imaging of specific cellular or molecular processes.
8. Safety and Relative Ease of Use: Most MRI contrast agents are based on chelates of gadolinium, which are generally considered safe. They are administered intravenously and are usually well-tolerated by patients.
9. Complementary to Other Imaging Modalities: Contrast agents can provide additional information that may not be available through other imaging techniques like X-ray, CT, or non-contrast MRI.
10. Guidance for Biopsy and Interventions: Enhanced MRI images can be used to guide minimally invasive procedures, such as biopsies or the placement of catheters.
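The T1-shortening effect of a gadolinium chelate is commonly summarised by the linear relaxivity relation 1/T1_post = 1/T1_pre + r1·[Gd]. A small illustrative calculation (the relaxivity, baseline T1, and concentration below are typical textbook magnitudes, not values from this lecture):

# Effect of a gadolinium-based contrast agent on T1 (illustrative numbers only).
r1 = 4.0              # relaxivity, 1/(mM*s), typical order of magnitude at 1.5 T
t1_pre = 1.0          # native tissue T1, seconds
concentration = 0.5   # local Gd concentration, mM

rate_post = 1.0 / t1_pre + r1 * concentration   # relaxation rate 1/T1 after contrast
t1_post = 1.0 / rate_post
print(f"T1 shortens from {t1_pre:.2f} s to {t1_post:.2f} s")   # about 0.33 s here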
The imaging modalities listed above each have unique capabilities for examining different aspects of body function and structure. Here are examples of the processes that each imaging type can visualize:
1. CT (Computed Tomography):
○ Focus: Primarily used for detailed images of the body's anatomy, especially bones, hard tissues,
and some soft tissues.
○ Example Process: CT can create cross-sectional images of the head to visualize the skull and brain.
Random Darkening
1. Definition: Random darkening, also known as film fogging or developer fog, refers to the
appearance of a uniform, undesired darkening of the film that is not related to the exposure to
the X-ray radiation.
2. Cause: This type of noise can be caused by various factors, including:
○ Chemical reactions within the film emulsion that occur even in the absence of radiation,
such as those caused by heat or humidity.
○ Mishandling of the film, which can lead to unwanted reactions with the emulsion.
○ Defects or contamination in the film or the processing chemicals.
3. Effect on Image: Random darkening can reduce the overall contrast of the image and make it more difficult to distinguish between different structures. It can also introduce a bias towards darker tones, which can affect the interpretation of the image.
4. Reduction: To reduce random darkening, it is important to:
○ Store the film under proper conditions (e.g., cool, dry, and away from light).
○ Handle the film carefully to avoid physical damage or contamination.
○ Use fresh, properly mixed processing chemicals and follow the recommended processing times and temperatures.
In summary, quantum mottle and random darkening are two distinct sources of noise that can affect the quality of X-ray images. Quantum mottle is a fundamental limitation of photon-based imaging due to the quantum nature of light, while random darkening is an undesired effect that can be minimized through careful handling and processing of the film. Both types of noise can reduce the contrast and visibility of the image, making it more challenging to diagnose conditions based on the radiographic image.
Negative stain electron microscopy (negative stain EM) is a technique used to visualize biological samples such as macromolecules, macromolecular complexes, and viruses by enhancing their contrast in the electron microscope. The term "negative stain" refers to the method of applying a heavy metal salt, which preferentially stains the background rather than the biological sample itself, creating a dark background with lighter regions where the sample is located. This results in the sample appearing as a "negative" or inverse image against the darkly stained background.
Here are the key aspects of negative stain EM:
1. Sample Preparation: The biological sample is applied to a supportive surface, typically a carbon-coated copper grid.
2. Staining Process: After the sample is adhered to the grid, the excess liquid is wicked away, and a drop of a heavy metal salt solution (the negative stain) is applied. Commonly used stains include uranyl acetate, uranyl formate, or ammonium molybdate.
3. Drying: The grid is then tilted or blotted to remove the excess stain, leaving the sample surrounded by a layer of the heavy metal salt.
4. Drying and Vitrification: The grid is allowed to dry, and the sample is vitrified if it is in a thin layer of water. This can help to maintain the native conformation of the sample.
5. Imaging: The stained, dried sample is then examined in a transmission electron microscope. The heavy metal salt scatters electrons more strongly than the biological material, creating a dark background that makes the
biological sample visible as a lighter region.
6. Advantages: Negative staining is a relatively quick and straightforward technique that does not require cryogenic temperatures or high-vacuum conditions. It is also useful for studying samples that are difficult to crystallize, such as large macromolecular complexes or viruses.
7. Limitations: The resolution of negative stain EM is typically limited to about 15-20 Å (1.5-2 nm) due to the staining process and the possibility of the sample shrinking away from the stain. Additionally, the technique can sometimes introduce artifacts, such as distortion or flattening of the sample, and it does not provide 3D structural information.
8. Applications: Negative stain EM is widely used for the initial characterization of macromolecular structures, for studying the morphology of viruses, and for assessing the homogeneity and stability of biological samples.
Negative stain EM is a valuable tool in structural biology, providing insights into the shape, size, and surface features of macromolecules and complexes that are not readily obtained by other methods. It can also be a complementary technique to cryo-electron microscopy, especially when studying samples that may not be suitable for crystallization or when high-resolution structural information is not required.
Differences between negative stain and cryo EM
Negative stain electron microscopy (negative stain EM) and cryo-electron microscopy (cryo-EM) are both techniques used to visualize biological samples at high resolution, but they differ in several key aspects, including sample preparation, the state of the sample, and the resulting image quality. Here's a comparison of the two methods:
Negative Stain EM
1. Sample State: The sample is dried onto a support grid, often after adsorption to a carbon film.
2. Staining: Uses heavy metal salts (like uranyl acetate or ammonium molybdate) that stain the background, leaving the biological sample appearing lighter by contrast.
3. Sample Preparation: Relatively straightforward; the sample is applied to a grid, excess liquid is wicked away, and the negative stain is added.
4. Hydration: The sample is dehydrated, which can cause conformational changes or collapse of the structure.
5. Resolution: Generally limited to about 15-20 Å (1.5-2 nm) due to the staining process and potential sample distortion.
6. Artifacts: Can introduce artifacts such as flattening or shrinkage of the sample.
7. Imaging: The sample is imaged in a non-cryogenic, or room temperature, state.
8. Applications: Suited for initial structural characterization, morphology studies, and assessing sample homogeneity.
Cryo-EM
1. Sample State: The sample is maintained in a near-native, hydrated state by rapid freezing (vitrification) to form a glass-like ice layer.
2. Staining: No staining is used; the sample is imaged as-is within the ice.
3. Sample Preparation: More complex, involving plunge freezing or high-pressure freezing to prevent ice crystal formation, which could damage the sample.
4. Hydration: The sample remains hydrated, which helps to preserve its native conformation.
5. Resolution: Can achieve near-atomic resolution (better than 2 Å) for well-prepared, homogeneous samples.
6. Artifacts: Fewer artifacts related to staining; however, radiation damage and the "missing wedge" problem can be challenges.
7. Imaging: The sample is imaged under cryogenic conditions to maintain the vitreous state.
8. Applications: Ideal for determining high-resolution 3D structures of macromolecules, complexes, and even large cellular structures.
Negative stain                          Vitrified sample
High contrast image                     Low contrast image
No special temperature control          Storage at 85 K (about −188 °C)
No radiation damage                     High radiation damage
Particle distorted                      Particle undistorted
Image is of the stain shell             Image is of the actual particle
Low resolution                          High resolution
Good for initial sample screening       Best choice for reconstruction
Dried sample                            Frozen, hydrated sample
High signal to noise                    Low signal to noise
Single Particle Analysis (SPA) and Electron Tomography (ET):
• SPA: Used to determine 3D structures of non-crystalline specimens like proteins and viruses.
• ET: Generates 3D structures of thicker cellular sections or isolated macromolecular complexes.
12. Image Quality Factors
• Source: energy of photons, collimation.
• Object: attenuation coefficient, source-object geometry.
• Detector: object-detector geometry, efficiency.
13. X-Ray Radiography Clinical Applications
• Chest, skeletal, abdomen, dental assessments.
• Advantages: inexpensive, low radiation, good contrast.
• Disadvantages: poor soft tissue differentiation, 2D only.
14. X-ray Computed Tomography (CT)
• Cross-sectional images from multiple X-ray images taken at different angles.
• History: Radon transform, Allan Cormack, Godfrey Hounsfield.
• From 128x128 matrix to 512x512 matrix
CT Scanning and Electromagnetic Energy
1. X-ray Source: In a CT scan, an X-ray source emits a beam of electromagnetic energy, which consists of X-ray photons.
2. Rotation: The X-ray source and an array of detectors rotate around the patient, capturing images from multiple angles. This rotation ensures that the X-rays pass through the patient from various directions.
3. Attenuation: As the X-rays pass through the patient's body, they are attenuated (lessened in intensity) by different amounts depending on the density of the tissues they encounter. Bones, for example, absorb more X-rays than soft tissues or air.
4. Detection: The detectors on the opposite side of the patient from the X-ray source measure the attenuated X-ray energy. Because the X-rays are coming from different angles, the detectors can gather a range of data about the internal structures.
5. Data Collection: The intensity of the X-ray beam is measured at multiple points around the body, creating a set of data for each angular position.
6. Reconstruction: The collected data is then used by a computer to reconstruct a 2D cross-sectional image (slice) of the patient's anatomy. This process involves complex mathematical algorithms, such as the filtered back projection or more advanced iterative reconstruction techniques.
7. Series of Slices: By repeating the process while incrementally moving the patient through the scanner, a series of 2D images can be obtained. These can be combined to create a 3D representation of the area of interest.
8. Contrast Agents: In some cases, a contrast agent (such as iodine for vascular studies or barium for gastrointestinal studies) may be introduced into the patient's body to enhance the differences between tissues and improve the visibility of specific structures.
Benefits of Using Electromagnetic Energy from All Angles
• Detailed Imaging: The use of X-rays from multiple angles allows for the creation of detailed, high-resolution images that can reveal subtle differences in tissue density.
• Cross-Sectional Views: CT scans provide cross-sectional images, which are particularly useful for visualizing internal structures that may be obscured in traditional planar X-ray images.
• 3D Reconstruction: The series of 2D images can be used to generate 3D models of the body's internal structures, which can be valuable for surgical planning and other applications.
• Diagnostic Accuracy: The comprehensive data obtained from multiple angles can lead to more accurate diagnoses and better patient outcomes.
15. CT Scan Operation
• 2D cross-sectional images.
• Different body parts absorb beams differently.
• Contrast material enhances images.
16. Scanner Components
• Gantry, detectors, array processor, computer, storage, console, scan controller.
1. Gantry: The gantry is the large, donut-shaped structure that houses the X-ray tube and the detectors. It rotates around the patient to capture images from different angles.
2. Detectors: These are the devices within the gantry that detect the X-ray energy that has passed through the patient. They convert the X-ray energy into electrical signals, which are then processed into images.
3. Array Processor: The array processor is a specialized hardware component that performs the complex mathematical operations required to reconstruct the 2D image slices from the
data collected by the detectors.
4. Computer: The computer system manages the overall operation of the CT scanner, including
controlling the X-ray source, managing the data acquisition, and interfacing with the user.
5. Storage: This refers to the storage devices where the raw data and reconstructed images are
temporarily or permanently stored. This can include both local storage within the scanner and
networked storage systems.
6. Console: The console is the operator's interface, typically a separate workstation where the
radiographer or technician controls the scanner settings, initiates scans, and reviews the
acquired images.
7. Scan Controller: The scan controller is the component that manages the timing and
coordination of the scan, including the rotation of the gantry, the emission of the X-ray beam,
and the synchronization with the detector array.
*A single CT image may involve approximately 800 rays taken at 1000 different projection angles.
Recent advances in reconstruction algorithms aim to reduce the number of projection images needed to attain a given resolution without violating the Nyquist criterion, a fundamental principle in sampling theory that dictates the minimum rate at which a signal can be sampled without introducing errors.
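Both CT and tomographic EM reconstructions build an image from such projections; the core idea can be sketched with a simple, unfiltered back projection in Python. This is an illustrative toy rather than the reconstruction pipeline described in the lecture; filtered back projection would additionally apply a ramp filter to each projection before smearing it back, which removes the blur seen here.

import numpy as np
from scipy.ndimage import rotate

# Toy phantom: a bright rectangle.
N = 128
phantom = np.zeros((N, N))
phantom[40:90, 50:80] = 1.0

angles = np.linspace(0.0, 180.0, 60, endpoint=False)

# Forward projection (a crude Radon transform): rotate the object and sum along one axis.
sinogram = np.stack([rotate(phantom, a, reshape=False, order=1).sum(axis=0) for a in angles])

# Unfiltered back projection: smear each 1D projection back across the image
# at its acquisition angle and accumulate; the result is a blurred version of the phantom.
reconstruction = np.zeros_like(phantom)
for a, projection in zip(angles, sinogram):
    smear = np.tile(projection, (N, 1))
    reconstruction += rotate(smear, -a, reshape=False, order=1)
reconstruction /= len(angles)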
The "missing wedge" problem is a limitation encountered in 3D electron tomography, a technique used to obtain
three-dimensional (3D) structural information from biological samples. This issue arises due to the physical constraints of the sample holder and the electron microscope itself, which prevent the sample from being tilted through a full 180-degree range.
Here are the key points about the missing wedge problem:
1. Tilt Limitation: In electron tomography, a series of 2D images are acquired at different tilt angles to reconstruct a 3D image. However, the sample can only be tilted to a certain maximum angle, typically up to ±60 to 70 degrees, due to the physical limitations of the microscope and the holder.
2. Data Inaccessibility: Because the sample cannot be tilted a full 180 degrees, there is a range of angles from which the electron microscope cannot collect data. This results in a "missing wedge" of data in the 3D data set.
3. Anisotropic Resolution: The missing wedge leads to anisotropic resolution, where the resolution is not uniform in all directions. The resolution is typically better along the tilt axis and worsens as it moves perpendicular to the tilt axis.
4. Impact on 3D Reconstruction: The missing wedge can cause structures to appear elongated or smeared along the beam direction (z-axis), and it can also lead to a loss of information, particularly for structures near the edges of the sample.
5. Sub-Tomogram Averaging (STA): One approach to mitigate the missing wedge problem is through STA, where similar structures from multiple tomograms are averaged to fill in the missing information.
19. CT Generations
• 1st to 7th generation advancements: detector numbers, scan times, ring artifacts, helical scanning
1st Generation: Rotate/Translate (Pencil Beam)
• Features: These early scanners had a single X-ray detector and used a pencil beam (a narrow, focused beam) that was moved linearly to collect data across a 24 cm field of view (FOV). The scanner would rotate and then translate (move) the beam to cover a full 360 degrees, obtaining 180 projections at 1-degree intervals.
• Time: It took about 4.5 minutes per scan, with an additional 1.5 minutes required for image reconstruction.
2nd Generation: Rotate/Translate (Narrow Fan Beam)
• Improvements: Introduced a linear array of detectors (around 30), which allowed for more data to be collected per rotation, improving image quality.
• Data Collection: The system collected data for 600 rays times 540 views, significantly more than the first generation.
• Speed: The scan time was reduced to about 18 seconds per slice.
3rd Generation: Rotate/Rotate (Wide Fan Beam)
• Design: Featured a substantial increase in the number of detectors (to more than 800) and a
• Design: Featured a substantial increase in the number of detectors (to more than 800) and a
wider fan beam that covered the entire patient.
• Mechanics: The X-ray tube and detector array were mechanically linked and rotated together
around the patient, eliminating the need for the scanner to translate the X-ray source.
• Speed: Newer systems could achieve scan times of half a second or less.
4th Generation: Rotate/Stationary
• Innovation: Designed to overcome the issue of ring artifacts that were common in 3rd
generation scanners.
• Detectors: Used a stationary ring of detectors (around 4,800) that did not move with the
gantry.
5th Generation: Stationary/Stationary (Electron Beam CT)
• Technology: No conventional rotating X-ray tube; instead, it used an electron beam that was
steered around a large arc of tungsten encircling the patient, with the detectors on the
opposite side.
• Speed: Capable of extremely fast scan times, around 50 milliseconds, which allowed for
dynamic imaging, such as the beating heart.
6th Generation: Helical (Spiral)
• Innovation: Scanners acquired data while the patient table was continuously moving, allowing for more continuous and faster scanning.
• Advantages: Reduced the need for patient breath-holding, decreased the amount of contrast agent required, and increased patient throughput.
• Scan Time: In some cases, the entire scan could be completed within a single breath-hold.
7th Generation: Multiple Detector Array (Wide-Detector or Volumetric CT)
• Detectors: Utilized an increased number of detector rows, allowing for wider coverage of the patient's anatomy with each rotation and the ability to capture a volume of data instead of just a single slice.
• Applications: Enabled the use of adaptive statistical iterative reconstruction (ASIR) techniques, which can reduce radiation dose while maintaining image quality.
6. Dual-Axis Tomography: Another strategy to overcome the missing wedge is to perform dual-axis or multi-tilt tomography, where the sample is imaged on two orthogonal axes, providing more complete data coverage.
7. Resolution Improvement: Advances in reconstruction algorithms aim to reduce the impact of the missing wedge by improving the estimation of missing data and enhancing the overall resolution of the 3D reconstruction.
8. Sample Thickness: The missing wedge problem is more pronounced in thicker samples, where the increased scattering and absorption of electrons exacerbate the effects of the missing data.
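A quick way to see how severe the missing wedge is for a given tilt range (a sketch assuming a single-axis tilt series; the ±60° value matches the limit quoted above):

import numpy as np

max_tilt_deg = 60.0   # maximum tilt angle achievable in the holder/microscope

# For a single-axis tilt series covering ±max_tilt, projection directions span
# 2*max_tilt out of the 180 degrees needed for complete Fourier-space coverage.
missing_fraction = (180.0 - 2.0 * max_tilt_deg) / 180.0
print(f"Fraction of Fourier space left unsampled: {missing_fraction:.0%}")   # ~33% for ±60°

# A boolean "missing wedge" mask in a 2D Fourier plane (kx in the tilt plane, kz along the beam).
n = 128
kx, kz = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
angle_from_kx = np.degrees(np.arctan2(np.abs(kz), np.abs(kx)))
missing_wedge = angle_from_kx > max_tilt_deg   # directions never sampled by the tilt series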
Here's how CT windowing works:
1. Window Width: This is the range of Hounsfield units that the display is set to show. For instance, a window width of 2000 HU (centered on 0) would display values from -1000 HU to +1000 HU.
2. Window Level: This is the center point of the window width. Adjusting the window level shifts the range of HU values that are displayed as a mid-gray. For example, if the window level is set to 40 HU with a window width of 2000 HU, the display will show values from -960 HU to +1040 HU.
3. Visualizing Different Tissues: By adjusting the window width and level, radiologists can emphasize the contrast between different types of tissues. For example:
○ A wide window width with a low window level is used for lung imaging (where the contrast between air and lung tissue is important).
○ A narrower window width with a level set to around 40 HU is used for mediastinum or brain imaging (to show more detail in soft tissues).
4. Applications: CT windowing is crucial for interpreting CT scans, as it allows radiologists to focus on specific tissues or structures within the body. It helps in diagnosing various conditions, assessing the extent of injuries, and guiding treatment decisions.
5. Interactive Windowing: Modern CT viewing software allows radiologists to interactively adjust the window width and level to get the best view of the anatomy or pathology in question.
6. Dual or Multi-Windowing: Sometimes, two or more windows are displayed simultaneously, each with different settings, to compare different tissues or to show both bones and soft tissues effectively.
In summary, CT windowing is a vital tool in radiology that enhances the interpretability of CT images by allowing radiologists to adjust the contrast and brightness to better visualize different tissues within the body.
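The window/level operation is just a linear mapping of HU values onto the display grayscale. A minimal numpy sketch (the window settings and simulated HU values are examples, not a clinical protocol):

import numpy as np

def apply_window(hu_image, level, width):
    """Map HU values inside [level - width/2, level + width/2] onto 0-255 grayscale."""
    lo = level - width / 2.0
    hi = level + width / 2.0
    clipped = np.clip(hu_image, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255).astype(np.uint8)

# Example: the same simulated slice displayed with a soft-tissue window and a very wide window.
hu_slice = np.random.default_rng(1).integers(-1000, 2000, size=(256, 256)).astype(float)
soft_tissue_view = apply_window(hu_slice, level=40, width=400)
wide_view = apply_window(hu_slice, level=0, width=2000)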
Cryo FIB-milling (if the target is deep inside the cell)
Future Perspectives:
• Advances in reconstruction algorithms.
• Multi-tilt tomography to overcome the missing wedge problem.
• Development of new labeling and staining techniques for better resolution and contrast.