JOHN M. OLLINGER AND JEFFREY A. FESSLER


Medical imaging is often thought of as a way of viewing structures of the body. Indeed, x-ray computed tomography (CT) and magnetic resonance imaging (MRI) yield exquisitely detailed images of such structures. It is often useful, however, to acquire images of physiologic function rather than of anatomy. Such images can be acquired by imaging the decay of radioisotopes bound to molecules with known biological properties. This class of imaging techniques is known as nuclear medicine imaging.

The most common form of nuclear medicine scan uses a gamma-ray-emitting radio-isotope bound to a chemical with known physiological properties. After it is administered, single photons emitted by the decaying isotope are detected with a gamma camera [1]. These cameras consist of a lead collimator to ensure that all detected photons propagated along parallel paths, a crystal scintillator to convert high-energy photons to visible light, and photomultiplier tubes and associated electronics to determine the position of each incident photon from the light distribution in the crystal. A two-dimensional (2D) histogram of the detected events forms a projection image of the distribution of the radio-isotope and hence of the chemical compound. An example of such a procedure would be a cardiac study using thallium-201. Image intensity is indicative of cardiac perfusion and can be used to diagnose defects in the blood supply. It is widely used to screen for bypass surgery.

Planar imaging with gamma cameras has three major shortcomings. First, the images are projection images, so the organ of interest can be obscured by activity in front of or behind it. Moreover, photons originating in the organ of interest can be attenuated by overlying tissue. This is a problem, for example, in scans of obese women, where attenuation in the breast can be misinterpreted as a cardiac defect.
Second, the radiopharmaceuticals must incorporate relatively heavy isotopes such as thallium-201 and technetium-99m. Since these elements do not occur naturally in biologically active molecules, the synthesis of physiologically useful tracers incorporating them is a challenging technical problem. This restricts the number of available radiopharmaceuticals. Finally, the lead collimator absorbs many photons, thereby reducing the sensitivity of the camera.

These shortcomings are being addressed. The limitations of projection imaging can be overcome by acquiring tomographic data with a rotating gamma camera and then correcting for attenuation in a tomographic reconstruction. This method is known as single-photon emission computed tomography (SPECT) [1]. Continuing research in radiochemistry has made more radiopharmaceuticals available. Finally, newer SPECT cameras with two or three rotating heads have improved the sensitivity. Nevertheless, single-photon imaging still suffers from poor sensitivity and poor quantitative accuracy.

In this article we review positron-emission tomography (PET), which has inherent advantages that avoid these shortcomings. PET image reconstruction methods with origins in signal and image processing are discussed, including the potential problems of these methods. A summary of statistical image reconstruction methods, which can yield improved image quality, is also presented.

Advantages of PET

One advantage that allows PET to avoid the above-mentioned shortcomings is that attenuation correction is easily accomplished in PET. Also, positron-emitting isotopes of carbon, nitrogen, oxygen, and fluorine occur naturally in many compounds of biological interest and can therefore be readily incorporated into a wide variety of useful radiopharmaceuticals. In addition, collimation is done electronically, so no collimator is required, leading to relatively high sensitivity.

JANUARY 1997 IEEE SIGNAL PROCESSING MAGAZINE 43

The major problem with PET is its cost. The short half-life of most positron-emitting isotopes requires an on-site cyclotron, and the scanners themselves are significantly more expensive than single-photon cameras. Nevertheless, PET is widely used in research studies and is finding growing clinical acceptance, primarily for the diagnosis and staging of cancer.

A PET study begins with the injection or inhalation of a radiopharmaceutical. The scan is begun after a delay ranging from seconds to minutes to allow for transport to and uptake by the organ of interest. When the radio-isotope decays, it emits a positron, which travels a short distance before annihilating with an electron. This annihilation produces two high-energy (511 keV) photons propagating in nearly opposite directions. If two photons are detected within a short (~10 ns) timing window (the coincidence timing window), an event (called a true coincidence if neither photon is scattered) is recorded along the line connecting the two detectors, sometimes referred to as a line of response (LOR). Summing many such events yields quantities that approximate line integrals through the radio-isotope distribution. The validity of this approximation depends, of course, on the number of counts collected. For 2D imaging, these line integrals form a discrete approximation of the Radon transform [3] of a cross-section of the radio-isotope concentration and can be inverted to form an image of the radio-isotope distribution. If they are suitably calibrated, PET images yield quantitative estimates of the concentration of the radiopharmaceutical at specific locations within the body.
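The line-integral approximation described above can be sketched numerically: summing an activity image along sets of parallel lines yields a discrete sinogram, and a point source traces a sinusoid across the angles. This is an illustrative sketch only; the `sinogram` helper and its nearest-pixel sampling scheme are hypothetical, not how a scanner actually bins coincidence events.

```python
import math

# Illustrative sketch only: `sinogram` and its nearest-pixel sampling
# are hypothetical, not how a scanner actually bins coincidences.
def sinogram(image, n_angles, n_offsets):
    """Approximate parallel line integrals (a discrete Radon transform)
    of a square activity image by stepping along each line of response
    and summing nearest-pixel values."""
    n = len(image)
    center = (n - 1) / 2.0
    sino = [[0.0] * n_offsets for _ in range(n_angles)]
    for a in range(n_angles):
        theta = math.pi * a / n_angles
        c, s = math.cos(theta), math.sin(theta)
        for k in range(n_offsets):
            d = k - (n_offsets - 1) / 2.0      # signed transverse offset
            total, t = 0.0, -n / 2.0
            while t < n / 2.0:                 # march along the LOR
                x = center + d * c - t * s
                y = center + d * s + t * c
                i, j = round(y), round(x)
                if 0 <= i < n and 0 <= j < n:
                    total += image[i][j]
                t += 0.5
            sino[a][k] = 0.5 * total           # scale by the step size
    return sino

# A point source traces a sinusoid in (d, theta), as described above:
img = [[0.0] * 32 for _ in range(32)]
img[16][20] = 1.0                              # off-center point source
sino = sinogram(img, 16, 32)
```

The peak of each projection row shifts with angle, from near offset bin 20 at the first angle toward the center bin at the angle a quarter-turn later, which is the sinusoidal trace that gives sinograms their name.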
The kinetics of the pharmaceutical can be modeled as a linear dynamic system with the arterial concentration of radio-isotope in the blood as the input and the PET measurement as the output. The state variables are the concentrations in different compartments of the tissue; examples of compartments would be blood, the interstitial space between cells, and the interiors of cells. Compartments need not be related to physical spaces and can represent, for example, bound and unbound states of the radiopharmaceutical. The exchange rates between the compartments are parameters of the models. Acquiring a series of images sequentially after injection yields a time-course of the sum of the quantity of tracer in each compartment, i.e., of the output of the model, which can be used to estimate the model's parameters. These parameters can then be used to calculate physiological parameters of interest, such as blood flow, glucose metabolism, and receptor binding characteristics. Thus, PET can be used for precise quantitative measurements of specific physiological quantities.

The Physics of PET

A diagram of a PET scanner is shown in Fig. 1. The subject is surrounded by a cylindrical ring of detectors with a diameter of cm and an axial extent of cm.

1. A transaxial view of a PET scanner (upper panel) and top view (lower panel) showing the rod sources used for attenuation correction (A), the septa used for scatter reduction (B), the detector blocks consisting of crystals (C) and photomultiplier tubes (D), and the end-shields (E).

The detectors are shielded from radiation from outside the field of view by relatively thick lead end-shields. Most scanners can be operated in either a slice-collimated mode, where axial collimation is provided by thin annular rings of tungsten called septa, or in a fully three-dimensional (3D) mode, where the septa are retracted and coincidences can be collected between all possible detector pairs.
(All commercially available PET scanners simultaneously acquire data for 3D images, either by imaging the entire volume as a unit or by stacking adjacent 2D slices.)

Detectors

The most critical components of a PET camera are the detectors [4]. In some cases these are similar to those used in single-photon imaging: large crystals of sodium-iodide coupled to many photomultiplier tubes (PMTs) [5]. A more commonly used configuration is shown in Fig. 2. In these detectors a rectangular bundle of crystals, a block, is optically coupled to several PMTs. When a photon interacts in the crystal, electrons are moved from the valence band to the conduction band. These electrons return to the valence band at impurities in the crystal, emitting light in the process. Since the impurities usually have metastable excited states, the light output decays exponentially at a rate characteristic of the crystal. The ideal crystal has high density so that a large fraction of incident photons scintillate, high light output for positioning accuracy, fast rise time for accurate timing, and a short decay time so that high counting rates can be handled. Most current scanners use bismuth germanate (BGO), which generates approximately 2500 light photons per 511-keV photon and has a decay time (i.e., time-constant) of 300 ns. One such block, for example, couples a 7x8 array of BGO crystals to four PMTs, where each crystal is 3.3 mm wide in the transverse plane, 6.25 mm wide in the axial dimension, and 30 mm deep. The block is fabricated in such a way that the amount of light collected by each PMT varies uniquely depending on the crystal in which the scintillation occurred [4]. Hence integrals of the PMT outputs can be decoded to yield the position of each scintillation. The sum of the integrated PMT outputs is proportional to the energy deposited in the crystal.

Resolution

If the data are acquired in the slice-collimated (2D) mode, the LORs connecting crystals can be binned into sets of parallel projections at evenly spaced angles, as shown in Fig. 3. Two characteristics are evident. First, samples are unevenly spaced, with finer sampling at the edges of the field of view than at the center. Second, the samples along the heavy solid line at angles one and three are offset by one-half of the detector spacing from samples at angle two. Therefore, adjacent parallel projections can be combined to yield one-half the number of projection angles with a sampling distance of one-half the detector width. (There is a degradation of image quality associated with this approximation, but it is imperceptible for realistic imaging situations.) A typical block might have 3.3-mm crystals, so the resulting sampling distance would be 1.65 mm. The Nyquist criterion is usually stated in medical imaging applications as requiring that the sampling distance be one-half the spatial resolution expressed as the full-width-at-half-maximum (FWHM).
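The position decoding described above, in which integrals of the PMT outputs are decoded to a scintillation position and their sum estimates the deposited energy, can be sketched with a simple Anger-logic centroid. The 2x2 PMT layout and light-sharing values below are illustrative, not the calibration of an actual block.

```python
# Sketch of block-detector position decoding (Anger logic). The 2x2
# PMT layout and the light-sharing values are illustrative, not an
# actual block's calibration.
def decode_position(pA, pB, pC, pD):
    """Estimate the scintillation position inside a block from the four
    integrated PMT outputs; pA/pB are the left PMTs, pC/pD the right,
    with pA/pC on top. The output sum estimates deposited energy."""
    e = pA + pB + pC + pD              # proportional to deposited energy
    x = (pC + pD - pA - pB) / e        # left-right light asymmetry
    y = (pA + pC - pB - pD) / e        # top-bottom light asymmetry
    return x, y, e

# A flash sharing most of its light with the right-hand PMTs decodes
# to the right half of the block (x > 0):
x, y, e = decode_position(10.0, 10.0, 40.0, 40.0)
```

In practice a calibrated lookup table would map the continuous (x, y) estimate to a discrete crystal index, and the energy sum e would be compared against a threshold for energy discrimination.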
(The full-width-at-half-maximum is defined as the distance between the half-value points of the impulse response. It is the minimum separation required for two distinct points to be resolved.) Hence, this block would support a spatial resolution of 3.3 mm. In fact, a scanner with this crystal size has a measured resolution that is somewhat worse, varying from 3.6 mm at the center of the field of view to 5.0 mm at 20 cm from the center. This occurs because scintillations usually consist of one or more Compton interactions followed by photoelectric absorption (assuming the photon is not scattered out of the crystal). Since a 511-keV photon travels on average 7.5 mm in BGO before interacting, the light output is spatially distributed, especially at large radial distances, where it is often distributed across two crystals. The best obtainable resolution is termed the intrinsic resolution. This resolution is rarely achieved in practice because unfiltered images are usually very noisy. Although current scanners have intrinsic resolutions of less than 5 mm, the final resolution of the image is usually greater than 8 mm because the reconstruction algorithms trade off resolution for reduced image variance (as discussed later in this article). This final resolution is called the reconstructed resolution. Therefore, the resolution of PET images as they are typically used is not determined by the detectors, but by the degree to which resolution must be degraded to achieve an acceptable image variance. Since the variance is determined by the number of counts that can be collected during the scan, the constraints that govern the clinically useful resolution of PET images are the dosage of the radiopharmaceutical, the duration of the scan, the sensitivity of the scanner, and the count-rate capability of the scanner.

2. A block detector consisting of a 7x8 array of crystals coupled to four PMTs.

Positron Range
When the radio-isotope decays, it emits a positron with some nonzero energy. The positron interacts with electrons as it travels through the body, losing energy with each interaction. When its momentum is nearly zero, it annihilates with an electron to produce two annihilation photons, each with an energy of 511 keV. These photons propagate along nearly collinear paths, with the degree of noncollinearity depending on the momentum of the positron and electron when they annihilated. The divergence from collinearity is on the order of one degree or less and is usually ignored. The distance the positron travels before annihilating is termed the positron range. The magnitude of this range depends on the positron energy, which varies widely among isotopes. The distribution of positron ranges is very sharply peaked, with FWHMs ranging from mm and full-width-at-tenth-maximums (FWTMs) ranging from mm [4, 6] in body tissues. The ranges are much larger in the lungs and other regions containing a significant fraction of air. Since positron range is much smaller than the resolution of most scanners, it is not a serious source of error and is usually ignored.
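The claim that positron range (and noncollinearity) can usually be ignored can be checked with a back-of-envelope resolution budget. The quadrature combination below and the ~0.0022 × ring-diameter noncollinearity term are common rules of thumb, not figures from this article, and all numbers are illustrative.

```python
import math

# Back-of-envelope resolution budget in FWHM terms. The quadrature
# combination and the ~0.0022 * ring-diameter noncollinearity term are
# common rules of thumb, not figures from this article; the numbers
# below are illustrative.
def system_fwhm_mm(detector_mm, positron_range_mm, ring_diameter_mm):
    noncollinearity_mm = 0.0022 * ring_diameter_mm
    return math.sqrt(detector_mm ** 2 +
                     positron_range_mm ** 2 +
                     noncollinearity_mm ** 2)

fwhm = system_fwhm_mm(detector_mm=3.3,
                      positron_range_mm=0.5,
                      ring_diameter_mm=800.0)
```

With these illustrative numbers, dropping the positron-range term changes the total by only about one percent, consistent with the statement that the range is not a serious source of error for most scanners.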

3. Sampling pattern in the transaxial plane for a PET scanner. Each segment in the detector ring represents one crystal. The solid lines show the parallel projections for the first angle, the dotted lines for the second angle, and the dashed lines for the third angle.

Attenuation

The two possible interactions at 511 keV are photoelectric absorption and Compton scatter. The incidence of photoelectric absorption is negligible for 511-keV photons in body tissues. In a Compton interaction, the photon interacts with an outer-shell electron such that its path is deflected and it loses some of its energy. Most scattered photons are scattered out of the field of view and are never detected. The effect of these interactions is termed attenuation. The survival probability, i.e., the probability that neither photon of an annihilation pair interacts as the photons propagate along the line l(d, θ) at transverse distance d and angle θ, is given by

P_{dθ} = exp( -∫_{l(d,θ)} μ(s) ds ),   (1)

where μ(s) is the linear attenuation coefficient at position s. Typical minimum survival probabilities are 0.15 for head scans and for body scans. The survival probability given by Eq. (1) is also referred to as the narrow-beam attenuation. The significance of Eq. (1) is that the attenuation experienced by a given pair of annihilation photons is independent of the position of the annihilation along the LOR. This makes possible a simple precorrection of the data. Equation (1) does not hold for SPECT, necessitating approximate methods or computationally expensive iterative methods of attenuation correction. This is the major reason for the relatively poor quantitative accuracy of SPECT relative to PET.

Scattered Events

Those annihilations for which one or both photons are scattered, but both are still detected, are termed scattered events, as shown in Fig. 4. These events are incorrectly positioned because the photons' paths are not collinear. A relatively small 30-degree scatter at the center of a typical scanner mispositions the event by 10 cm.
The overall effect is to add an error signal to the data at low spatial frequencies. Since photons lose a fraction of their energy when they undergo a Compton interaction, they can be discriminated from unscattered photons by measuring the energy they deposit in the crystal. This energy can be estimated by the sum of the integrated PMT outputs. Although this measurement is only accurate to within approximately +/- 10% on most scanners, it can be used with a simple threshold to reject a significant fraction of the scattered events. For scanners using sodium-iodide detectors [5], this accuracy improves to +/- 5%. This not only improves the effectiveness of energy discrimination, but also improves the accuracy of the scatter correction [7, 8].

Accidental Coincidences

Given the large number of scattered photons and the relatively small solid angle subtended by the detector ring, it is apparent that for many annihilations only one of the photons will be detected. These events are termed singles. If two singles arising from separate annihilations are detected within the same coincidence timing window, they will be recorded as shown in Fig. 4. These events are termed accidental coincidences, or randoms. The rate of accidental coincidences can be related to the singles rates by noting that for each single detected at detector i, on average τR_j singles occur at detector j during the coincidence timing window τ, where R_j is the singles rate at detector j. Since each of these τR_j singles results in a coincidence, there are τR_iR_j coincidences per unit time for which the first detected photon is incident on detector i. The total number of accidental coincidences is the sum of those for which the first photon is detected at detector i and those for which the first photon is detected at detector j. Hence, the rate of accidental coincidences along the LOR connecting detectors i and j is given by

R_{ij} = 2τ R_i R_j.   (2)

Examination of Eq. (2) shows that reducing the coincidence timing window reduces the counting rate of accidental coincidences. However, timing inaccuracies due to variations in the rise-time of the crystal light output require a timing window of ns for BGO. Since the incident singles rates are proportional to the amount of injected isotope, the accidental coincidence rate increases as the square of the amount of isotope in the field of view (for counting rates that do not saturate the detectors). This count-rate limitation, along with detector deadtime, determines the upper limit on the injected dose for many studies.

Detector Efficiency

The efficiency of photon detection varies not only from block detector to block detector, but also varies widely across the elements of a block detector [6]. Referring to Fig. 2, it is apparent that a photon scattering in a central element will probably deposit the remainder of its energy in adjacent crystals. It might not be positioned as accurately as an event that deposits all of its energy in a single crystal, but it will be detected. A photon scattering in an edge crystal, on the other hand, has a significant probability of scattering out of the entire block and not being detected at all. This results in a decrease of detection efficiency in the edge crystals relative to the center crystals. This efficiency is different for scattered and true events because scattered events have different photon energies and, for a given line of response, the scattered photons arrive from a wide range of angles while unscattered photons detected along a given LOR all arrive at nearly the same angle.

Detector Deadtime

The time required to process a single event limits the counting rate of a PET scanner [9]. Event processing begins with the rising edge of the pulse for the first detector involved. The pulse is integrated for some time interval, then position calculations and energy discrimination are performed. The detector is dead to new events during this time. At very low counting rates, randoms are negligible and the number of true events is linearly related to the amount of activity in the field of view. The number of randoms increases as the square of the activity in the field of view until deadtime becomes significant. Then the number of true events begins to saturate. As the counting rate increases further, the numbers of trues and randoms peak and then decline because of detector saturation. Deadtime is the dominant effect that limits the injected dose.

Fully Three-Dimensional PET

In the foregoing discussion we have assumed that data are collected in 2D planes. Current scanners have retractable septa so that coincidences can be acquired between all possible pairs of detectors, a mode called fully 3D PET [10].
The effective sensitivity of such a scanner increases by up to a factor of eight, resulting in significant reductions in either the image variance or the injected dose. This improvement comes at the cost of a large increase in the scatter fraction and singles rates (due to less shielding for annihilations inside the field of view and a larger acceptance angle for annihilations outside the field of view of the camera). Until recently, fully 3D data were not used for two reasons: the unavailability of image reconstruction algorithms and the necessity of rejecting scattered events with axial collimation. The advent of appropriate reconstruction algorithms [11] and scatter-correction algorithms [12-14] has changed this. Fully 3D imaging is finding increasing use except in studies where counting rates are very high or shielding from regions just outside the field of view is required.

A Physical Model

If statistical effects are ignored, these factors can be incorporated into a model for the total number of recorded events to yield

Y_{dθ} = γ_{dθ} ( η^t_{dθ} P_{dθ} λ_{dθ} + η^r_{dθ} r_{dθ} + η^s_{dθ} s_{dθ} ),   (3)

where λ_{dθ} is the number of annihilations with photons emitted along the LOR specified by (d, θ) in Fig. 3, P_{dθ} is the survival probability as defined in Eq. (1), r_{dθ} is the number of accidental coincidences, s_{dθ} is the number of scattered events, η^t_{dθ} is the probability of detection for true events, η^r_{dθ} is the probability of detection for accidental coincidences, η^s_{dθ} is the probability of detection for scattered events, and γ_{dθ} is the probability of an event not being lost due to deadtime.

Of the effects included in Eq. (3), attenuation is not only the most pronounced but also the most straightforward to characterize. Prior to the emission scan, a transmission scan is performed. Here a rotating line source containing a long-lived isotope rotates around the subject to provide a nonzero flux of photons along each line of response. The measured data yield the number of transmitted events, T_{dθ}, along each LOR.
Every morning a blank scan, i.e., a transmission scan with nothing in the scanner, is performed to yield a data set, B_{dθ}. The survival probability given by Eq. (1) is approximated by the ratio

P̂_{dθ} = T_{dθ} / B_{dθ}.   (4)

4. Diagram of a scattered event (left) and an accidental coincidence (right). Photons shown leaving the ring are scattered through an oblique angle such that their paths do not intersect a detector.

This estimate of the survival probabilities would be exact if the data were noiseless. However, they are not noiseless, so they contribute significantly to the overall image variance unless noise-reduction algorithms are applied. These algorithms utilize smoothing [15], segmentation and reprojection [16, 17], or statistical image reconstruction and reprojection [18-21].

A simple way to estimate the accidental coincidences is to note that the arrival times of the photons due to randoms are uniformly distributed in time, while those of true coincidences fall within the coincidence timing window. Collecting data in a second coincidence timing window that is offset in time such that it collects no true coincidences yields data with nearly the same mean as that of the accidental coincidences falling in the trues timing window. The measured data are given by the product γ_{dθ} η^r_{dθ} r_{dθ}, so the detector efficiencies for accidental coincidences, η^r_{dθ}, do not have to be estimated. Therefore, not only is the method simple to implement, but it can be performed in hardware before the data are stored. The major drawback of this approach is that the variance of the estimate is of the same order of magnitude as the variance of the data if a significant fraction of detected events are accidental coincidences. In this case, the subtraction can lead to a significant increase in the variance of the data unless noise-reduction methods are used [22]. This variance increase can be avoided by counting the number of singles at each detector and using Eq. (2). Since there are many more singles than true coincidences, the effect on variance is relatively minor. This approach is not widely used because of the additional requirements placed on the acquisition hardware and because singles rates often vary over the course of an acquisition.

For septa-extended scans, the fraction of scattered events is low (approximately 15% of the total number of collected events). They are usually estimated as an integral transformation of the measured data using an empirically determined kernel [23]. For fully 3D scans, the scatter fraction rises to 30-50% of the total number of events. It can be estimated using a mathematical model of the scanner and scattering process [13, 24] or by utilizing data collected in a second, lower-energy window that acquires a higher fraction of scattered events [12, 14]. The detector efficiencies for true and scattered events are estimated from a scan of a calibration source with known characteristics [9]. Deadtime is dependent on many factors related to the architecture and design of a specific machine, so its estimation is tailored to the scanner [25].
It is usually assumed to be constant over the duration of the scan. These parameters can be used to estimate the number of emitted photons by using the expression

λ̂_{dθ} = ( Y_{dθ} - R_{dθ} - S_{dθ} ) / ( γ_{dθ} η^t_{dθ} P̂_{dθ} ),   (5)

where we assume that R_{dθ} = γ_{dθ} η^r_{dθ} E[r_{dθ}], S_{dθ} = γ_{dθ} η^s_{dθ} E[s_{dθ}], and E[·] denotes expectation.

The data modeled by Eq. (3) are often stored in 2D arrays with the columns indexed by d and the rows by θ. These data arrays are often called sinograms because, for a point source, d varies sinusoidally with θ.

Image Reconstruction Assuming Deterministic Data

The goal of image reconstruction is to recover the radiotracer concentration from the measurements. This inverse problem is not unlike the classical signal-processing problem of deconvolution [26]. However, straightforward application of off-the-shelf signal processing and image restoration methods yields suboptimal results for PET image reconstruction.

In this section we summarize some of the methods that have been proposed for PET image reconstruction, with a particular emphasis on those with origins in signal and image processing. This review is by no means complete and is primarily intended to describe the potential pitfalls of each approach. Most of the discussion also applies to SPECT image reconstruction. We begin by deriving a widely used linear algorithm, then discuss pre- and post-processing techniques proposed for use with it, and end with a survey of statistical image reconstruction methods.

Filtered Backprojection

One way to greatly (over)simplify the problem is to ignore the measurement noise altogether and to assume that the measured data approximate line integrals through the radio-isotope distribution. This leads to the classical filtered-backprojection (FBP) method for tomographic image reconstruction [27]. This method is used routinely for x-ray CT, as well as for PET and SPECT. Its widespread popularity stems from historical reasons of computational simplicity, not from any widely accepted advantage in image quality. Since it is derived without any statistical information, it is unsurprising that use of the unmodified FBP method leads to unacceptable noise amplification in PET. Filtered backprojection was first applied to PET by Shepp et al. [28]. Introductory treatments of the algorithm can be found in [27] and [29], and more comprehensive treatments in [3] and [30].

The distribution of the radio-isotope is modeled by the function λ(x, y, z). For a given 2D slice, we assume that the mean of an individual measurement Y_{dθ} is given by

ḡ_θ(d) = ∫_{l(d,θ)} λ(x, y, z) ds,

where l(d, θ) is the line connecting the two detectors involved in the coincidence. In practice, it is assumed that the mean ḡ_θ(d) is equal to the corrected data, λ̂_{dθ}, in Eq. (5). In the rotated coordinate system of Fig. 5, d = x_θ, so the line integral can be expressed as

g_θ(x_θ) = ∫ λ(x_θ cos θ - y_θ sin θ, x_θ sin θ + y_θ cos θ) dy_θ,

where x_θ represents transverse distance in the rotated coordinate system shown in Fig. 5. We will refer to the function g_θ(x_θ) (and the data it approximates) as a projection. The Fourier transform of each projection is given by

G_θ(ρ) = ∫ g_θ(x_θ) e^{-j2πρx_θ} dx_θ = Λ(ρ cos θ, ρ sin θ),   (7)

where Λ(u, v) is the 2D Fourier transform of the image. This result, known as the projection-slice theorem, has two implications. First, the Fourier transform of a projection yields samples of the 2D Fourier transform of the image, and second, these samples lie along a line at the same angle, θ, in the frequency domain as that of the projection in the spatial domain. This result can be written in more standard notation as

G_θ(ρ) = Λ(ρ, θ),

where the Fourier transform of the image is now expressed in polar coordinates (ρ, θ). Eq. (7) can be used to reconstruct the image by constructing the Fourier transform in polar coordinates, interpolating to rectangular coordinates, and then taking the inverse transform. A more efficient method can be derived as follows. The image λ(x, y) is given by

λ(x, y) = ∫∫ Λ(u, v) e^{j2π(ux+vy)} du dv.

Transforming to polar coordinates, as shown in Fig. 5, using the expressions u = ρ cos θ, v = ρ sin θ, x = r cos φ, and y = r sin φ, yields

λ(x, y) = ∫_0^{2π} ∫_0^∞ Λ(ρ cos θ, ρ sin θ) e^{j2πρr cos(φ-θ)} ρ dρ dθ.

5. Projection geometry.

Rewriting Λ(ρ cos θ, ρ sin θ) as Λ(ρ, θ) and using the facts that cos(φ - θ) = -cos(φ - θ + π) and G_θ(ρ) = G_{θ+π}(-ρ), this can be rewritten as

λ(x, y) = ∫_0^π ∫_{-∞}^∞ |ρ| Λ(ρ, θ) e^{j2πρr cos(φ-θ)} dρ dθ.

Applying the projection-slice theorem leads to

λ(x, y) = ∫_0^π ĝ_θ( r cos(θ - φ) ) dθ,

where ĝ_θ(x) = F⁻¹{ |ρ| G_θ(ρ) } and F⁻¹{·} denotes the inverse Fourier transform. Discretizing leads to the expression

λ(x, y) ≈ (π/N) Σ_{i=1}^{N} ĝ_{θ_i}( r cos(θ_i - φ) ).   (13)

Equation (13) shows that the value of the image at a point (r cos φ, r sin φ) in Fig. 5 can be found by first filtering the projections with a ramp filter, then summing the filtered values at the coordinate x_{θ_i} = r cos(θ_i - φ) over all projection angles θ_i. Note that the value at x_{θ_i} will contribute to all pixels along the LORs that contributed to the measurement at this point. The algorithm can be efficiently implemented by filtering each estimated projection, g_θ(d) = λ̂_{dθ}, with a ramp filter to yield ĝ_θ(d) and then adding each filtered value into all voxels along the corresponding LOR, as shown by the dashed line in Fig. 5. (Note that the discretization and the finite support of the image and projections necessitate modifications to the filter [27].) The latter operation is called backprojection, so the algorithm is unsurprisingly called filtered backprojection.
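The two steps just derived, ramp filtering followed by backprojection, can be sketched in a few dozen lines. This is an illustrative, unoptimized sketch under simplifying assumptions: spatial-domain Ram-Lak taps stand in for the frequency-domain ramp, nearest-bin backprojection replaces interpolation, and the helper names are hypothetical.

```python
import math

# Minimal filtered-backprojection sketch for parallel-beam data.
# `sino[a][k]` holds the corrected projection value at angle index a
# and radial bin k (unit bin spacing). Illustrative only.

def ramp_kernel(half_width):
    """Spatial-domain taps of the band-limited ramp (Ram-Lak) filter."""
    h = []
    for n in range(-half_width, half_width + 1):
        if n == 0:
            h.append(0.25)
        elif n % 2 == 0:
            h.append(0.0)
        else:
            h.append(-1.0 / (math.pi * n) ** 2)
    return h

def fbp(sino, size):
    n_ang, n_off = len(sino), len(sino[0])
    h = ramp_kernel(n_off)
    # 1) filter each projection with the ramp kernel (a convolution)
    filt = [[sum(row[k] * h[j - k + n_off] for k in range(n_off))
             for j in range(n_off)] for row in sino]
    # 2) backproject: each filtered sample adds to all pixels on its LOR
    c_img, c_off = (size - 1) / 2.0, (n_off - 1) / 2.0
    img = [[0.0] * size for _ in range(size)]
    for a, row in enumerate(filt):
        th = math.pi * a / n_ang
        cs, sn = math.cos(th), math.sin(th)
        for i in range(size):
            for j in range(size):
                d = (j - c_img) * cs + (i - c_img) * sn  # transverse coord
                k = round(d + c_off)                     # nearest radial bin
                if 0 <= k < n_off:
                    img[i][j] += row[k] * math.pi / n_ang
    return img

# Point source at the center of the field of view: every projection
# has a single spike at the central bin.
sino = [[0.0] * 15 for _ in range(8)]
for row in sino:
    row[7] = 1.0
img = fbp(sino, 15)        # reconstruction peaks at the central pixel
```

The π/N weighting mirrors the discrete sum above, and the negative side lobes of the ramp kernel are what cancel the 1/r blur that plain (unfiltered) backprojection would leave.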
This algorithm and its extension to three dimensions [11, 31] is used almost exclusively for image reconstruction in PET. It is identical to the algorithm used in x-ray CT except for modifications to the filter necessitated by the noise properties of PET data. There are several problems with this algorithm. First, although the intensity is known to be non-negative, the algorithm yields negative values, particularly if the data are noisy. Second, models for the detector response must be space-invariant and can only be incorporated into the algorithm as a deconvolution, with the attendant noise amplification. Finally, and most importantly, the ramp filter accentuates high-frequency noise. This effect can be seen by examining the magnitude spectra of the typical and low-noise projections of the same image shown in Fig. 6. (The low-noise projection was found by reconstructing the image and reprojecting it to form an estimated projection. The variance of the noise in this estimated projection will be reduced by a factor approximately equal to the number of projections. In this case 192 projections were used.) It is apparent that reconstructing with an unwindowed ramp filter is unwise. For frequencies above 0.8 cm⁻¹ the data are dominated by noise, so the resulting images would be too noisy, as shown at the top left in Fig. 7. Moreover, in many systems, frequencies near the foldover frequency are significantly aliased and should be rejected. Therefore, the ramp is often truncated at one-half the foldover frequency, as shown at the bottom in Fig. 6. The effect on the image is shown at the top right in Fig. 7. Although this image is still too noisy for visualization, it would be useful for quantitative measurements that involve averaging over a region. This window degrades the resolution from an intrinsic resolution of 5.2 mm to a reconstructed resolution of 5.6 mm FWHM.
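The trade-off just described can be sketched by comparing reconstruction-filter frequency responses: the unwindowed ramp, a ramp truncated at one-half the foldover frequency, and a smoothly apodized (Hanning-weighted) variant. Frequencies are in cycles per sample (foldover = 0.5), and the cutoff values are illustrative, not the physical cm⁻¹ figures quoted in the text.

```python
import math

# Sketch of reconstruction-filter choices: an unwindowed ramp, a ramp
# truncated at half the foldover frequency, and a Hanning-apodized
# ramp. Frequencies in cycles/sample (foldover = 0.5); cutoffs are
# illustrative values.
def ramp(f):
    return abs(f)

def truncated_ramp(f, cutoff=0.25):
    return abs(f) if abs(f) <= cutoff else 0.0

def hanning_ramp(f, cutoff=0.5):
    if abs(f) > cutoff:
        return 0.0
    return abs(f) * 0.5 * (1.0 + math.cos(math.pi * f / cutoff))

freqs = [i / 100.0 for i in range(51)]     # 0 ... 0.5 cycles/sample
resp = [(ramp(f), truncated_ramp(f), hanning_ramp(f)) for f in freqs]
```

The truncated ramp rejects everything above its cutoff outright, while the apodized ramp rolls off gradually, which is why apodization trades a little resolution for a less noisy-looking image.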
For visualization purposes, the ramp filter is often apodized with a Hanning, Parzen, or JANUARY 1997 IEEE SIGNAL PROCESSING MAGAZINE 49

Butterworth window. An image reconstructed with a fifth-order Butterworth window with a cutoff frequency of 1 cm⁻¹ is shown at the bottom left in Fig. 7. Examination of the image shows what appears to be a small defect in the thalamus, as shown by the arrow. This particular subject was scanned again in the fully 3D mode three minutes after the first scan, yielding the image shown at the bottom right in Fig. 7. (The fully 3D image differs from the 2D image in that more counts were collected and the image was sampled by many more LORs.) There is no evidence of the defect in this image. The apparent defect is probably due to noise at spatial frequencies near 1 cm⁻¹, which are not attenuated by the Butterworth filter. In this case, filtering gives the impression of a noise-free image by reducing high-frequency noise but does not eliminate low-frequency artifacts. Concern over such issues leads naturally to the development of more sophisticated algorithms.

Sinogram Preprocessing

The apodization window applied to the reconstruction filter is equivalent to smoothing the projections prior to reconstruction. Although this smoothing does reduce the noise variance, it is suboptimal since PET measurement statistics are nonstationary because they follow a Poisson distribution. There have been several attempts to improve the sinogram smoothing using both iterative and noniterative [35] nonstationary methods. While requiring less computation than the iterative methods described below, these preprocessing methods are still suboptimal since object constraints such as non-negativity and piecewise smoothness are not naturally expressed in the sinogram domain.

Image Post-processing

The radiotracer distribution estimate computed by any reconstruction method is typically represented by a discrete image. This certainly invites the application of many an image processing method, both those classical (such as Wiener filtering) as well as those trendy (such as wavelets, neural nets, etc.). Unfortunately, most image processing methods are based on the (often implicit) assumption that the noise is Gaussian, or at least independent from pixel to pixel. The noise in tomographic images is generally highly correlated between neighboring pixels (since each measurement ray transects many pixels). For the (linear) FBP method the noise correlation function can be determined; it can also be determined for some statistical image reconstruction methods [39-41], although the correlation functions may be expensive to compute. In our experience, classical image processing methods perform poorly for images with such correlated noise. Furthermore, the correlation structure is often nonstationary, so noise prewhitening is usually impractical. On the other hand, post-processing methods that specifically account for the correlation structure have shown some promise, e.g., [42].

Statistical Image Reconstruction

A More Complete Model of the Data

This summary of statistical reconstruction methods is condensed from [43]. The measurement statistics are quite complex, so any treatment (including ours) must make simplifying assumptions. However, many papers in the signal processing and statistics literature oversimplify the problem, e.g., [44], so we attempt to be somewhat more complete here. We modify the notation used earlier to emphasize functional dependencies. Since PET measurements are based on a counting process, a reasonable statistical model is that the measurements have independent Poisson distributions:

Y_i ∼ Poisson{ȳ_i(λ)},  i = 1, ..., n,   (14)

where n is the number of coincident detector pairs, λ is the spatial distribution of radio-tracer (typical units are counts/s/cm³), and ȳ_i(λ) is the mean of the ith measurement. (Note that each i corresponds to a unique dθ pair in the notation used above.) (If a deterministic finite number of nuclei are injected into the patient, then, strictly speaking, a multinomial distribution would be more precise than the Poisson assumption. However, in practice the exact number of nuclei is unknown and may well be considered a random variable with a Poisson distribution. In this case the radioactive decay will be a Poisson process; furthermore, a Poisson process thinned by Bernoulli trials remains Poisson [45], all of which leads to the Poisson model.) The measurement means depend on the radio-tracer distribution λ(x) through the physical model described above; for low to moderate counting rates, the dependence is nearly linear in λ:

ȳ_i(λ) = T ∫ p_i(x) λ(x) dx + T s_i(λ) + T r_i(λ),   (15)

where T is the scan time, p_i(x) is the (unitless, scatter-free) point-response function of the ith detector pair (p_i(x) is the probability that a positron emitted from a nucleus at position x will produce a pair of annihilation photons that are detected by the ith detector pair without scattering, including geometric effects, attenuation, and detector efficiencies), s_i(λ) is the mean rate of detected scattered events for the ith detector pair, r_i(λ) is the mean rate of detected random coincidences for the ith detector pair, and the integral is over the scanner field of view. Although the scatter contribution s_i(λ) is linear in λ, the random coincidences r_i(λ) depend nonlinearly on λ (if the detectors are not saturated, the singles rates increase monotonically with λ and the randoms increase as the square of the singles rates as described by Eq. (2)). For most scanners, the singles rates required to model this dependence directly are not available, so the estimates obtained with the

delayed coincidence window [46] are used to obtain information about r_i(λ). For moderate counting rates, the linearity in λ implied by the first term in Eq. (15) is reasonable. However, for high count rates, the measurement means are highly nonlinear functions (they are, in fact, nonmonotonic functions) of the activity in the patient due to scanner deadtime [25]. In practice, the effect of this nonlinearity is often assumed to be reducible to a single deadtime correction factor for each plane, or, more accurately, by different correction factors for different detector pairs or detector blocks. This type of correction implicitly separates the nonlinear deadtime loss from the ideal linear relationship between λ and ȳ_i. We are unaware of any attempts to include the deadtime nonlinearity directly in the forward model. We also take the separable approach here.

Classical Estimation Methods

Since a PET scanner collects only a finite number of measurements, one must, in general, also represent the radiotracer distribution λ(x) by a finite parameterization, e.g., in terms of a set of basis functions:

λ(x) = Σ_{j=1}^{p} θ_j b_j(x),

where θ = [θ₁, ..., θ_p]′ is the vector of unknown coefficients that must be computed from the y's. (Typically b_j(x) is just the indicator function for the jth voxel, so we will refer to θ_j as the jth pixel value hereafter.) With such a discretization, the reconstruction problem is equivalent to a parameter estimation problem. If one assumes the scatter and random contributions are predetermined values s_i and r_i, respectively (i.e., if they are determined separately), and if the deadtime nonlinearity is approximated by a single known loss factor d_i, then the measurement mean is linear in θ:

ȳ_i(θ) = Σ_{j=1}^{p} a_ij θ_j + T s_i + T r_i,

where a_ij = T d_i ∫ p_i(x) b_j(x) dx. Dozens of papers have been published based on this model, most of which not only ignored the d_i, r_i, and s_i terms, but also used very simple approximations for p_i(x). The linear form above invites application of the two most common tools from statistical signal processing: maximum likelihood estimation and linear least-squares estimation. The linear least-squares estimate is easily written as

θ̂_LS = (A′A)⁻¹ A′(y − s − r),

but this expression is impractical for computation due to the large size of the matrix A = {a_ij}. (Even for a single plane of a typical scanner, the dimensions of A are enormous.) Furthermore, the conventional linear least-squares estimate produces negative pixel values, which are physically impossible. This can be a significant problem in low-activity regions of the image. Both the size of A and incorporation of the non-negativity constraint necessitate iterative algorithms. Although necessary because of existing instrumentation, the real-time correction for random coincidences using the delayed-window method renders the data non-Poisson. For such measurements, estimates based on (weighted) least-squares may be suitable [47]. (Also see [48] for more accurate approaches.) For scans that are not precorrected for randoms, the least-squares methods are suboptimal since they do not fully accommodate the Poisson distribution. (Often the number of counts per ray is sufficiently low that the Gaussian approximation to the Poisson distribution is inapplicable.) Furthermore, data-based weighted least-squares methods lead to systematic biases for low-count Poisson measurements [18, 41]. This problem can be avoided by using the measurement log-likelihood

L(θ) = Σ_{i=1}^{n} [ y_i log ȳ_i(θ) − ȳ_i(θ) ]

rather than the weighted least-squares criterion. Unfortunately, there is no closed-form expression for the estimate θ̂_ML that maximizes the likelihood, which again necessitates iterative algorithms. Unfortunately, each itera-

Fig. 6. The magnitude spectrum of a typical projection (upper curve) and a nearly noiseless projection (bottom curve) are shown at the top.
Two practical filters are shown at the bottom: a ramp filter cut off at 50% of the Nyquist rate and the same filter windowed with a Butterworth filter. (Horizontal axes: frequency in 1/cm.)

tion of these algorithms requires computation time roughly comparable to that required by the FBP method. This has hampered their clinical acceptance. The oldest of these algorithms (for PET) is an expectation-maximization (EM) algorithm [49], which converges very slowly to θ̂_ML. This slow convergence has not greatly diminished the popularity of the EM algorithm, however, because the intermediate images generated during the iterations toward θ̂_ML are usually more appealing than θ̂_ML itself. (Determining which of the many iterates is the best one is nontrivial, however.) The problem of determining λ(x) from {Y_i} is inherently ill-posed [50], so, after parameterization, the problem of estimating θ is generally very ill-conditioned. Thus θ̂_ML is usually extremely noisy [50]. Naturally, one simple way to reduce this noise is to postsmooth θ̂_ML. Such postsmoothing is a special case of the more general method of sieves [50] and is in fact by far the most popular version of the sieve method. Postsmoothing has two disadvantages. First, in its usual form of space-invariant filtering, the nonstationarity of the measurement statistics cannot be modeled. And second, although postsmoothing reduces noise, the problem of slow convergence of the EM algorithm remains, and hundreds to thousands of EM iterations may be required for the postsmoothed images to converge [51]. This problem has spawned a variety of methods for accelerating the EM algorithm, which vary in the extent to which convergence is guaranteed (see [52, 53]).

Classical Regularization Methods

Another way to overcome the problems of slow convergence and to reduce the image noise is to replace the log-likelihood criterion by a penalized-likelihood objective function:

θ̂ = arg max_θ Φ(θ),  Φ(θ) = L(θ) − βR(θ),

where R(θ) is a measure of image roughness. Larger values of β encourage smoother images with less noise.
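The EM algorithm referred to above has a simple multiplicative form for the Poisson model. The following is a minimal numpy sketch, assuming the discretized model ȳ = Aθ + b with a known background b (scatter plus randoms); the initialization and fixed iteration count are illustrative choices:

```python
import numpy as np

def mlem(A, y, b, n_iter=50, eps=1e-12):
    """EM (MLEM) iterations for the model y_i ~ Poisson([A theta]_i + b_i).

    Each iteration costs one forward projection and one backprojection,
    and the multiplicative update keeps a non-negative starting image
    non-negative throughout (unlike unconstrained least squares)."""
    sens = A.sum(axis=0)                   # sensitivity image: sum_i a_ij
    theta = np.ones(A.shape[1])            # uniform non-negative start
    for _ in range(n_iter):
        ybar = A @ theta + b               # predicted measurement means
        ratio = y / np.maximum(ybar, eps)  # measured / predicted counts
        theta = theta / np.maximum(sens, eps) * (A.T @ ratio)
    return theta
```

Because non-negativity comes for free, EM-style iterations remain popular despite their slow convergence; acceleration schemes and the penalized objectives discussed in the text build on this same update structure.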
When first investigated for PET, the penalty function posed a computational challenge since the M-step of the EM algorithm has no closed form. However, now there are a variety of fast algorithms (compared to EM) available for maximizing such objective functions, e.g., [47, 52, 53, 57, 58]. These algorithms converge rapidly in part because the penalty function greatly improves the conditioning of the reconstruction problem. In the context of least-squares problems, such regularization methods date at least to the early 70s [59], so many may well be considered classical. The most classical penalty function simply measures the norm of the image:

R(θ) = Σ_j θ_j² = ‖θ‖²,

which has its origins in ridge regression. This simple penalty leads to images that are squashed down, since even the DC component is penalized. For reducing noise, a more suitable penalty is to discourage neighboring pixels from having disparate values:

R(θ) = Σ_j Σ_{k ∈ N_j} ψ(θ_j − θ_k),

where N_j is the set of pixel indices in the neighborhood of pixel j, and ψ(t) is a symmetric function typically chosen to be nondecreasing for t ≥ 0. Such penalty functions (or priors in the Bayesian terminology) have yielded good results in image restoration and image segmentation problems. However, in PET the nonstationary noise statistics again complicate the problem. Although R(θ) above is a shift-invariant function, recent analysis shows that images reconstructed by penalized-likelihood methods have nonuniform spatial resolution, due to interactions between the log-likelihood and penalty terms [60, 61].

Fig. 7. A glucose metabolism image reconstructed with a ramp filter (top left), a ramp filter cut off at one-half the Nyquist frequency (top right), a ramp filter cut off at one-half the Nyquist frequency windowed with a fifth-order Butterworth filter with a cutoff frequency of 1 cm⁻¹ (bottom left), and data of the same subject acquired in the fully 3D mode (bottom right). The fully 3D image was reconstructed from more finely sampled data containing a higher number of counts.
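The neighborhood penalty described above is easy to state concretely for the quadratic choice ψ(t) = t² and a four-pixel neighborhood. A small numpy sketch of the penalized objective Φ(θ) = L(θ) − βR(θ); the toy system matrix used in the test is a stand-in, not a scanner model:

```python
import numpy as np

def roughness(img):
    """Quadratic neighbor penalty: psi(t) = t^2 summed over all
    horizontally and vertically adjacent pixel pairs of a 2D image."""
    return float(np.sum(np.diff(img, axis=0) ** 2)
                 + np.sum(np.diff(img, axis=1) ** 2))

def penalized_objective(img, A, y, b, beta):
    """Phi(theta) = L(theta) - beta * R(theta) for y ~ Poisson(A theta + b)."""
    ybar = A @ img.ravel() + b
    loglik = float(np.sum(y * np.log(ybar) - ybar))   # constants dropped
    return loglik - beta * roughness(img)
```

A uniform image incurs zero penalty, so unlike the ridge-style norm penalty this roughness term does not squash the overall intensity; increasing β simply lowers the objective for rough images and steers the maximizer toward smoother ones.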
(Such effects are absent in image restoration problems with white Gaussian noise.) Although modified penalty functions have been proposed that reduce the resolution nonuniformity, these modifications cause more nonuniform noise variance [61, 62]. Another challenge in penalized-likelihood methods is choosing β. This problem is comparable to that of choosing the width of the apodizing window in FBP or the resolution of the filter used when postsmoothing ML images. However,

in the latter two problems the parameter that one varies to trade off resolution and noise is one that is naturally related to spatial resolution, whereas β has essentially arbitrary units. Automatic or data-based methods for choosing β, e.g., [63, 64], have shown some potential, but may also be unstable in imaging problems [65]. There is also no consensus on the best choice for ψ(t). Quadratic penalties lead to oversmoothing, and nonquadratic penalties require additional parameters that must be chosen. Nonconvex penalties cause additional problems with algorithm convergence, but have led to impressive results in image restoration problems in images with sharply defined regions [66]. However, in medical images one must take care to avoid turning smooth transitions into stair steps [67].

Model Errors

Nearly all papers on model-based methods for PET image reconstruction assume that the measurement model is known, particularly the system matrix A. In practice this matrix is occasionally measured, or more commonly simply computed based on an approximate geometry. In either case A contains errors, and the effect of this model mismatch on θ̂ is poorly understood. The errors in A might invite the application of the total least-squares (TLS) estimation method, e.g., [68]. However, TLS assumes that the errors in A are normally distributed, which is questionable in PET. Furthermore, A usually includes attenuation factors that are determined from separate noisy transmission scans. Understanding the effects of both deterministic and random errors in the model remains an important problem.

Attenuation Correction

As described above, the conventional attenuation correction method in PET uses the ratio of the measurements in the blank and transmission scans. The transmission measurements can be very noisy, and, with randoms subtraction, can even take on negative or zero values.
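The blank-to-transmission ratio just described can be sketched in a few lines. The floor imposed on the transmission counts is an illustrative guard against the zero and negative values mentioned above, not a standard correction:

```python
import numpy as np

def attenuation_factors(blank, trans, floor=1.0):
    """Conventional attenuation correction factors ACF_i = blank_i / trans_i
    along each LOR. Randoms-subtracted transmission counts can be zero or
    negative, so a floor is imposed before dividing (a crude guard; the
    smoothing and segmented-reconstruction methods in the text are the
    usual remedies for the resulting noise and bias)."""
    return np.asarray(blank, float) / np.maximum(np.asarray(trans, float), floor)
```

For example, `attenuation_factors([100.0, 100.0, 100.0], [50.0, 0.0, -3.0])` returns `[2.0, 100.0, 100.0]`: the nonpositive transmission counts are clipped to the floor, which is exactly where noisy ratios blow up and more careful statistical treatment pays off.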
This noise is usually reduced either by smoothing with a space-invariant filter [15] or by reconstructing an image of the attenuation coefficient from the line integrals in Eq. (1), segmenting it, and reprojecting it [16, 17]. However, these methods introduce bias and ignore the nonstationary statistics. More accurate attenuation correction factors can be computed by first using statistical methods to reconstruct an attenuation image while incorporating nonlinear constraints such as non-negativity and piecewise smoothness, and then reprojecting this image along all of the LORs.

SPECT

Most of the above discussion also applies to SPECT imaging. Statistical methods are perhaps even more useful in SPECT than in PET for two major reasons. First, attenuation is depth dependent and cannot be precorrected [69], and second, the resolution of collimators degrades with distance [70]. Both of these effects can be incorporated directly into statistical models [71]. In fact, for SPECT cardiac studies, statistical methods are now in routine use at some centers, e.g., [72], and the EM algorithm is available commercially.

Computing Speed and the Future

Since computers are continually increasing in speed and memory, it might seem at first that it is only a matter of time before iterative reconstruction methods become routinely used. However, the same advances in technology that lead to faster computers also lead to bigger and harder problems! For example, although computing speed certainly has reached the point where iterative methods are clinically feasible for 2D problems, the focus is now on 3D PET, where the size of A is many times larger than in 2D (after exploiting symmetries). Similar considerations apply to cone-beam SPECT, or even to parallel-collimator SPECT with 3D compensation for detector response. Thus, there is continuing need for new ideas in image reconstruction algorithm development.
Although some of those ideas will undoubtedly be borrowed from signal and image processing work, the algorithms must be based on accurate models of the physics and statistics of PET if they are to be fully effective. Convincingly demonstrating that new methods are truly more effective than previous methods requires careful matching of the resolution or noise properties of the methods compared. The medical imaging community is generally unconvinced by the type of anecdotal, single-image comparisons often found in image processing papers. There is increasing emphasis on formal statistical evaluations of different image reconstruction methods [73-75], which are also being applied to image processing [76].

Conclusion

The image formation process in PET lends itself well to relatively simple algorithms that yield accurate results when there are good counting statistics. Statistical methods can yield improved image quality but have not been widely adopted, largely because of their computational complexity. They play a more significant role in SPECT because they accurately incorporate models of attenuation and collimator resolution.

Acknowledgment

This work was supported by grants CA , CA-54362, and CA from the National Cancer Institute, and by grant number 1380 from the National Center for Research Resources. John M. Ollinger is an Assistant Professor of Biomedical Computing and Radiology at Washington University, St. Louis, Missouri (jmo@ibc.wustl.edu). Jeffrey A. Fessler is an Assistant Professor of Electrical Engineering at


More information

Ultrasonic Multi-Skip Tomography for Pipe Inspection

Ultrasonic Multi-Skip Tomography for Pipe Inspection 18 th World Conference on Non destructive Testing, 16-2 April 212, Durban, South Africa Ultrasonic Multi-Skip Tomography for Pipe Inspection Arno VOLKER 1, Rik VOS 1 Alan HUNTER 1 1 TNO, Stieltjesweg 1,

More information

218 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 44, NO. 2, APRIL 1997

218 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 44, NO. 2, APRIL 1997 218 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 44, NO. 2, APRIL 1997 Compton Scatter and X-ray Crosstalk and the Use of Very Thin Intercrystal Septa in High-Resolution PET Detectors Craig S. Levin, Member,

More information

CoE4TN4 Image Processing. Chapter 5 Image Restoration and Reconstruction

CoE4TN4 Image Processing. Chapter 5 Image Restoration and Reconstruction CoE4TN4 Image Processing Chapter 5 Image Restoration and Reconstruction Image Restoration Similar to image enhancement, the ultimate goal of restoration techniques is to improve an image Restoration: a

More information

DUE to beam polychromacity in CT and the energy dependence

DUE to beam polychromacity in CT and the energy dependence 1 Empirical Water Precorrection for Cone-Beam Computed Tomography Katia Sourbelle, Marc Kachelrieß, Member, IEEE, and Willi A. Kalender Abstract We propose an algorithm to correct for the cupping artifact

More information

Basics of treatment planning II

Basics of treatment planning II Basics of treatment planning II Sastry Vedam PhD DABR Introduction to Medical Physics III: Therapy Spring 2015 Monte Carlo Methods 1 Monte Carlo! Most accurate at predicting dose distributions! Based on

More information

Attenuation map reconstruction from TOF PET data

Attenuation map reconstruction from TOF PET data Attenuation map reconstruction from TOF PET data Qingsong Yang, Wenxiang Cong, Ge Wang* Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY 80, USA *Ge Wang (ge-wang@ieee.org)

More information

ISOCS Characterization of Sodium Iodide Detectors for Gamma-Ray Spectrometry

ISOCS Characterization of Sodium Iodide Detectors for Gamma-Ray Spectrometry ISOCS Characterization of Sodium Iodide Detectors for Gamma-Ray Spectrometry Sasha A. Philips, Frazier Bronson, Ram Venkataraman, Brian M. Young Abstract--Activity measurements require knowledge of the

More information

Fast Timing and TOF in PET Medical Imaging

Fast Timing and TOF in PET Medical Imaging Fast Timing and TOF in PET Medical Imaging William W. Moses Lawrence Berkeley National Laboratory October 15, 2008 Outline: Time-of-Flight PET History Present Status Future This work was supported in part

More information

CHAPTER 11 NUCLEAR MEDICINE IMAGING DEVICES

CHAPTER 11 NUCLEAR MEDICINE IMAGING DEVICES CHAPTER 11 M.A. LODGE, E.C. FREY Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins University, Baltimore, Maryland, United States of America 11.1. INTRODUCTION Imaging

More information

Conflicts of Interest Nuclear Medicine and PET physics reviewer for the ACR Accreditation program

Conflicts of Interest Nuclear Medicine and PET physics reviewer for the ACR Accreditation program James R Halama, PhD Loyola University Medical Center Conflicts of Interest Nuclear Medicine and PET physics reviewer for the ACR Accreditation program Learning Objectives 1. Be familiar with recommendations

More information

A Comparison of the Uniformity Requirements for SPECT Image Reconstruction Using FBP and OSEM Techniques

A Comparison of the Uniformity Requirements for SPECT Image Reconstruction Using FBP and OSEM Techniques IMAGING A Comparison of the Uniformity Requirements for SPECT Image Reconstruction Using FBP and OSEM Techniques Lai K. Leong, Randall L. Kruger, and Michael K. O Connor Section of Nuclear Medicine, Department

More information

Unmatched Projector/Backprojector Pairs in an Iterative Reconstruction Algorithm

Unmatched Projector/Backprojector Pairs in an Iterative Reconstruction Algorithm 548 IEEE TRANSACTIONS ON MEDICAL IMAGING, VOL. 19, NO. 5, MAY 2000 Unmatched Projector/Backprojector Pairs in an Iterative Reconstruction Algorithm Gengsheng L. Zeng*, Member, IEEE, and Grant T. Gullberg,

More information

UNIVERSITY OF SOUTHAMPTON

UNIVERSITY OF SOUTHAMPTON UNIVERSITY OF SOUTHAMPTON PHYS2007W1 SEMESTER 2 EXAMINATION 2014-2015 MEDICAL PHYSICS Duration: 120 MINS (2 hours) This paper contains 10 questions. Answer all questions in Section A and only two questions

More information

Artifact Mitigation in High Energy CT via Monte Carlo Simulation

Artifact Mitigation in High Energy CT via Monte Carlo Simulation PIERS ONLINE, VOL. 7, NO. 8, 11 791 Artifact Mitigation in High Energy CT via Monte Carlo Simulation Xuemin Jin and Robert Y. Levine Spectral Sciences, Inc., USA Abstract The high energy (< 15 MeV) incident

More information

Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited

Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited Summary We present a new method for performing full-waveform inversion that appears

More information

COUNT RATE AND SPATIAL RESOLUTION PERFORMANCE OF A 3-DIMENSIONAL DEDICATED POSITRON EMISSION TOMOGRAPHY (PET) SCANNER

COUNT RATE AND SPATIAL RESOLUTION PERFORMANCE OF A 3-DIMENSIONAL DEDICATED POSITRON EMISSION TOMOGRAPHY (PET) SCANNER COUNT RATE AND SPATIAL RESOLUTION PERFORMANCE OF A 3-DIMENSIONAL DEDICATED POSITRON EMISSION TOMOGRAPHY (PET) SCANNER By RAMI RIMON ABU-AITA A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY

More information

Algebraic Iterative Methods for Computed Tomography

Algebraic Iterative Methods for Computed Tomography Algebraic Iterative Methods for Computed Tomography Per Christian Hansen DTU Compute Department of Applied Mathematics and Computer Science Technical University of Denmark Per Christian Hansen Algebraic

More information

SUPPLEMENTARY INFORMATION

SUPPLEMENTARY INFORMATION SUPPLEMENTARY INFORMATION doi:10.1038/nature10934 Supplementary Methods Mathematical implementation of the EST method. The EST method begins with padding each projection with zeros (that is, embedding

More information

A Fast GPU-Based Approach to Branchless Distance-Driven Projection and Back-Projection in Cone Beam CT

A Fast GPU-Based Approach to Branchless Distance-Driven Projection and Back-Projection in Cone Beam CT A Fast GPU-Based Approach to Branchless Distance-Driven Projection and Back-Projection in Cone Beam CT Daniel Schlifske ab and Henry Medeiros a a Marquette University, 1250 W Wisconsin Ave, Milwaukee,

More information

FRONT-END DATA PROCESSING OF NEW POSITRON EMIS- SION TOMOGRAPHY DEMONSTRATOR

FRONT-END DATA PROCESSING OF NEW POSITRON EMIS- SION TOMOGRAPHY DEMONSTRATOR SOUDABEH MORADI FRONT-END DATA PROCESSING OF NEW POSITRON EMIS- SION TOMOGRAPHY DEMONSTRATOR Master of Science Thesis Examiners: Prof. Ulla Ruotsalainen MSc Defne Us Examiners and topic approved by the

More information

STATISTICAL positron emission tomography (PET) image

STATISTICAL positron emission tomography (PET) image IEEE TRANSACTIONS ON MEDICAL IMAGING, VOL. 23, NO. 9, SEPTEMBER 2004 1057 Accurate Estimation of the Fisher Information Matrix for the PET Image Reconstruction Problem Quanzheng Li, Student Member, IEEE,

More information

3 Nonlinear Regression

3 Nonlinear Regression CSC 4 / CSC D / CSC C 3 Sometimes linear models are not sufficient to capture the real-world phenomena, and thus nonlinear models are necessary. In regression, all such models will have the same basic

More information

3 Nonlinear Regression

3 Nonlinear Regression 3 Linear models are often insufficient to capture the real-world phenomena. That is, the relation between the inputs and the outputs we want to be able to predict are not linear. As a consequence, nonlinear

More information

Classification of Hyperspectral Breast Images for Cancer Detection. Sander Parawira December 4, 2009

Classification of Hyperspectral Breast Images for Cancer Detection. Sander Parawira December 4, 2009 1 Introduction Classification of Hyperspectral Breast Images for Cancer Detection Sander Parawira December 4, 2009 parawira@stanford.edu In 2009 approximately one out of eight women has breast cancer.

More information

Image Processing and Analysis

Image Processing and Analysis Image Processing and Analysis 3 stages: Image Restoration - correcting errors and distortion. Warping and correcting systematic distortion related to viewing geometry Correcting "drop outs", striping and

More information

Enhanced material contrast by dual-energy microct imaging

Enhanced material contrast by dual-energy microct imaging Enhanced material contrast by dual-energy microct imaging Method note Page 1 of 12 2 Method note: Dual-energy microct analysis 1. Introduction 1.1. The basis for dual energy imaging Micro-computed tomography

More information

Translational Computed Tomography: A New Data Acquisition Scheme

Translational Computed Tomography: A New Data Acquisition Scheme 2nd International Symposium on NDT in Aerospace 2010 - We.1.A.3 Translational Computed Tomography: A New Data Acquisition Scheme Theobald FUCHS 1, Tobias SCHÖN 2, Randolf HANKE 3 1 Fraunhofer Development

More information

Application of MCNP Code in Shielding Design for Radioactive Sources

Application of MCNP Code in Shielding Design for Radioactive Sources Application of MCNP Code in Shielding Design for Radioactive Sources Ibrahim A. Alrammah Abstract This paper presents three tasks: Task 1 explores: the detected number of as a function of polythene moderator

More information

Nuclear Medicine Imaging

Nuclear Medicine Imaging Introduction to Medical Engineering (Medical Imaging) Suetens 5 Nuclear Medicine Imaging Ho Kyung Kim Pusan National University Introduction Use of radioactive isotopes for medical purposes since 1920

More information

Development and Performance of a Sparsity- Exploiting Algorithm for Few-View Single Photon Emission Computed Tomogrpahy (SPECT) Reconstruction

Development and Performance of a Sparsity- Exploiting Algorithm for Few-View Single Photon Emission Computed Tomogrpahy (SPECT) Reconstruction Marquette University e-publications@marquette Master's Theses (2009 -) Dissertations, Theses, and Professional Projects Development and Performance of a Sparsity- Exploiting Algorithm for Few-View Single

More information

Scatter Correction Methods in Dimensional CT

Scatter Correction Methods in Dimensional CT Scatter Correction Methods in Dimensional CT Matthias Baer 1,2, Michael Hammer 3, Michael Knaup 1, Ingomar Schmidt 3, Ralf Christoph 3, Marc Kachelrieß 2 1 Institute of Medical Physics, Friedrich-Alexander-University

More information

Dynamic Reconstruction for Coded Aperture Imaging Draft Unpublished work please do not cite or distribute.

Dynamic Reconstruction for Coded Aperture Imaging Draft Unpublished work please do not cite or distribute. Dynamic Reconstruction for Coded Aperture Imaging Draft 1.0.1 Berthold K.P. Horn 2007 September 30. Unpublished work please do not cite or distribute. The dynamic reconstruction technique makes it possible

More information

Vivekananda. Collegee of Engineering & Technology. Question and Answers on 10CS762 /10IS762 UNIT- 5 : IMAGE ENHANCEMENT.

Vivekananda. Collegee of Engineering & Technology. Question and Answers on 10CS762 /10IS762 UNIT- 5 : IMAGE ENHANCEMENT. Vivekananda Collegee of Engineering & Technology Question and Answers on 10CS762 /10IS762 UNIT- 5 : IMAGE ENHANCEMENT Dept. Prepared by Harivinod N Assistant Professor, of Computer Science and Engineering,

More information

Bias-Variance Tradeos Analysis Using Uniform CR Bound. Mohammad Usman, Alfred O. Hero, Jerey A. Fessler and W. L. Rogers. University of Michigan

Bias-Variance Tradeos Analysis Using Uniform CR Bound. Mohammad Usman, Alfred O. Hero, Jerey A. Fessler and W. L. Rogers. University of Michigan Bias-Variance Tradeos Analysis Using Uniform CR Bound Mohammad Usman, Alfred O. Hero, Jerey A. Fessler and W. L. Rogers University of Michigan ABSTRACT We quantify fundamental bias-variance tradeos for

More information

SNIC Symposium, Stanford, California April The Hybrid Parallel Plates Gas Counter for Medical Imaging

SNIC Symposium, Stanford, California April The Hybrid Parallel Plates Gas Counter for Medical Imaging The Hybrid Parallel Plates Gas Counter for Medical Imaging F. Anulli, G. Bencivenni, C. D Ambrosio, D. Domenici, G. Felici, F. Murtas Laboratori Nazionali di Frascati - INFN, Via E. Fermi 40, I-00044 Frascati,

More information

Image Restoration and Reconstruction

Image Restoration and Reconstruction Image Restoration and Reconstruction Image restoration Objective process to improve an image, as opposed to the subjective process of image enhancement Enhancement uses heuristics to improve the image

More information

in PET Medical Imaging

in PET Medical Imaging Fast Timing and TOF in PET Medical Imaging William W. Moses Lawrence Berkeley National Laboratory October 15, 2008 Outline: Time-of-Flight PET History Present Status Future This work was supported in part

More information

Empirical cupping correction: A first-order raw data precorrection for cone-beam computed tomography

Empirical cupping correction: A first-order raw data precorrection for cone-beam computed tomography Empirical cupping correction: A first-order raw data precorrection for cone-beam computed tomography Marc Kachelrieß, a Katia Sourbelle, and Willi A. Kalender Institute of Medical Physics, University of

More information

Basics of treatment planning II

Basics of treatment planning II Basics of treatment planning II Sastry Vedam PhD DABR Introduction to Medical Physics III: Therapy Spring 2015 Dose calculation algorithms! Correction based! Model based 1 Dose calculation algorithms!

More information

Tomographic Image Reconstruction in Noisy and Limited Data Settings.

Tomographic Image Reconstruction in Noisy and Limited Data Settings. Tomographic Image Reconstruction in Noisy and Limited Data Settings. Syed Tabish Abbas International Institute of Information Technology, Hyderabad syed.abbas@research.iiit.ac.in July 1, 2016 Tabish (IIIT-H)

More information

Spiral ASSR Std p = 1.0. Spiral EPBP Std. 256 slices (0/300) Kachelrieß et al., Med. Phys. 31(6): , 2004

Spiral ASSR Std p = 1.0. Spiral EPBP Std. 256 slices (0/300) Kachelrieß et al., Med. Phys. 31(6): , 2004 Spiral ASSR Std p = 1.0 Spiral EPBP Std p = 1.0 Kachelrieß et al., Med. Phys. 31(6): 1623-1641, 2004 256 slices (0/300) Advantages of Cone-Beam Spiral CT Image quality nearly independent of pitch Increase

More information

Advanced Image Reconstruction Methods for Photoacoustic Tomography

Advanced Image Reconstruction Methods for Photoacoustic Tomography Advanced Image Reconstruction Methods for Photoacoustic Tomography Mark A. Anastasio, Kun Wang, and Robert Schoonover Department of Biomedical Engineering Washington University in St. Louis 1 Outline Photoacoustic/thermoacoustic

More information

Assessment of OSEM & FBP Reconstruction Techniques in Single Photon Emission Computed Tomography Using SPECT Phantom as Applied on Bone Scintigraphy

Assessment of OSEM & FBP Reconstruction Techniques in Single Photon Emission Computed Tomography Using SPECT Phantom as Applied on Bone Scintigraphy Assessment of OSEM & FBP Reconstruction Techniques in Single Photon Emission Computed Tomography Using SPECT Phantom as Applied on Bone Scintigraphy Physics Department, Faculty of Applied Science,Umm Al-Qura

More information

The Design and Implementation of COSEM, an Iterative Algorithm for Fully 3-D Listmode Data

The Design and Implementation of COSEM, an Iterative Algorithm for Fully 3-D Listmode Data IEEE TRANSACTIONS ON MEDICAL IMAGING, VOL. 20, NO. 7, JULY 2001 633 The Design and Implementation of COSEM, an Iterative Algorithm for Fully 3-D Listmode Data Ron Levkovitz, Dmitry Falikman*, Michael Zibulevsky,

More information

Modeling and Incorporation of System Response Functions in 3D Whole Body PET

Modeling and Incorporation of System Response Functions in 3D Whole Body PET Modeling and Incorporation of System Response Functions in 3D Whole Body PET Adam M. Alessio, Member IEEE, Paul E. Kinahan, Senior Member IEEE, and Thomas K. Lewellen, Senior Member IEEE University of

More information

C a t p h a n / T h e P h a n t o m L a b o r a t o r y

C a t p h a n / T h e P h a n t o m L a b o r a t o r y C a t p h a n 5 0 0 / 6 0 0 T h e P h a n t o m L a b o r a t o r y C a t p h a n 5 0 0 / 6 0 0 Internationally recognized for measuring the maximum obtainable performance of axial, spiral and multi-slice

More information

Computational Medical Imaging Analysis

Computational Medical Imaging Analysis Computational Medical Imaging Analysis Chapter 2: Image Acquisition Systems Jun Zhang Laboratory for Computational Medical Imaging & Data Analysis Department of Computer Science University of Kentucky

More information

NONLINEAR BACK PROJECTION FOR TOMOGRAPHIC IMAGE RECONSTRUCTION

NONLINEAR BACK PROJECTION FOR TOMOGRAPHIC IMAGE RECONSTRUCTION NONLINEAR BACK PROJECTION FOR TOMOGRAPHIC IMAGE RECONSTRUCTION Ken Sauef and Charles A. Bournant *Department of Electrical Engineering, University of Notre Dame Notre Dame, IN 46556, (219) 631-6999 tschoo1

More information

Image Restoration and Reconstruction

Image Restoration and Reconstruction Image Restoration and Reconstruction Image restoration Objective process to improve an image Recover an image by using a priori knowledge of degradation phenomenon Exemplified by removal of blur by deblurring

More information

IN THIS PAPER we consider the solution of ill-posed

IN THIS PAPER we consider the solution of ill-posed IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 6, NO. 3, MARCH 1997 463 Tomographic Reconstruction and Estimation Based on Multiscale Natural-Pixel Bases Mickey Bhatia, William C. Karl, Member, IEEE, and

More information

Improvement of Efficiency and Flexibility in Multi-slice Helical CT

Improvement of Efficiency and Flexibility in Multi-slice Helical CT J. Shanghai Jiaotong Univ. (Sci.), 2008, 13(4): 408 412 DOI: 10.1007/s12204-008-0408-x Improvement of Efficiency and Flexibility in Multi-slice Helical CT SUN Wen-wu 1 ( ), CHEN Si-ping 2 ( ), ZHUANG Tian-ge

More information

Characterization of a Time-of-Flight PET Scanner based on Lanthanum Bromide

Characterization of a Time-of-Flight PET Scanner based on Lanthanum Bromide 2005 IEEE Nuclear Science Symposium Conference Record M04-8 Characterization of a Time-of-Flight PET Scanner based on Lanthanum Bromide J. S. Karp, Senior Member, IEEE, A. Kuhn, Member, IEEE, A. E. Perkins,

More information

Engineered Diffusers Intensity vs Irradiance

Engineered Diffusers Intensity vs Irradiance Engineered Diffusers Intensity vs Irradiance Engineered Diffusers are specified by their divergence angle and intensity profile. The divergence angle usually is given as the width of the intensity distribution

More information

Computed Tomography. Principles, Design, Artifacts, and Recent Advances. Jiang Hsieh THIRD EDITION. SPIE PRESS Bellingham, Washington USA

Computed Tomography. Principles, Design, Artifacts, and Recent Advances. Jiang Hsieh THIRD EDITION. SPIE PRESS Bellingham, Washington USA Computed Tomography Principles, Design, Artifacts, and Recent Advances THIRD EDITION Jiang Hsieh SPIE PRESS Bellingham, Washington USA Table of Contents Preface Nomenclature and Abbreviations xi xv 1 Introduction

More information

(RMSE). Reconstructions showed that modeling the incremental blur improved the resolution of the attenuation map and quantitative accuracy.

(RMSE). Reconstructions showed that modeling the incremental blur improved the resolution of the attenuation map and quantitative accuracy. Modeling the Distance-Dependent Blurring in Transmission Imaging in the Ordered-Subset Transmission (OSTR) Algorithm by Using an Unmatched Projector/Backprojector Pair B. Feng, Member, IEEE, M. A. King,

More information

Deviceless respiratory motion correction in PET imaging exploring the potential of novel data driven strategies

Deviceless respiratory motion correction in PET imaging exploring the potential of novel data driven strategies g Deviceless respiratory motion correction in PET imaging exploring the potential of novel data driven strategies Presented by Adam Kesner, Ph.D., DABR Assistant Professor, Division of Radiological Sciences,

More information

Performance Evaluation of radionuclide imaging systems

Performance Evaluation of radionuclide imaging systems Performance Evaluation of radionuclide imaging systems Nicolas A. Karakatsanis STIR Users meeting IEEE Nuclear Science Symposium and Medical Imaging Conference 2009 Orlando, FL, USA Geant4 Application

More information

SPECT: Physics Principles and Equipment Design

SPECT: Physics Principles and Equipment Design SPECT: Physics Principles and Equipment Design Eric C. Frey, Ph.D., Professor Division of Medical Imaging Physics Russell H. Morgan Department of Radiology and Radiological Science Disclosures Johns Hopkins

More information

Single-particle electron microscopy (cryo-electron microscopy) CS/CME/BioE/Biophys/BMI 279 Nov. 16 and 28, 2017 Ron Dror

Single-particle electron microscopy (cryo-electron microscopy) CS/CME/BioE/Biophys/BMI 279 Nov. 16 and 28, 2017 Ron Dror Single-particle electron microscopy (cryo-electron microscopy) CS/CME/BioE/Biophys/BMI 279 Nov. 16 and 28, 2017 Ron Dror 1 Last month s Nobel Prize in Chemistry Awarded to Jacques Dubochet, Joachim Frank

More information