Voxel dataset noise reduction algorithm based on CMM reference measurements for geometrical tolerance analysis of prismatic test pieces


Crhistian R Baldo 1, Thiago L Fernandes 2, Gustavo D Donatelli 2, Kostadin Doytchinov 3, Kostadin Vrantzaliev 3

1 Federal University of ABC - UFABC, Center for Engineering, Modeling and Applied Social Sciences, Santo André, Brazil, e-mail: crhistian.baldo@ufabc.edu.br
2 Reference Centers in Innovative Technologies - CERTI, Center for Metrology and Instrumentation, Florianópolis, Brazil, e-mail: thf@certi.org.br, gd@certi.org.br
3 Quality Vision International - QVI, KOTEM Hungary Ltd., Budapest H-1119, Hungary, e-mail: kostadin@kotem.com, kostadin.vrantzaliev@kotem.com

Abstract

Computed tomography for dimensional metrology has been part of the quality control loop for about a decade. To make tolerance verification consistent, GD&T evaluation programs have been developed that use CT point clouds generated through segmentation. This important step in a CT measurement sequence is investigated here from the perspectives of tolerance analysis and measurement uncertainty. Based on empirical master-part reference measurements, voxel dataset noise can be rationally filtered out by applying point-based reduction algorithms. Their significance for measurements of dimensional and geometrical characteristics is experimentally tested and discussed in this work.

Keywords: industrial tomography, measurement uncertainty, point cloud generation, geometrical tolerance verification

1 Introduction

Computed tomography (CT) for dimensional metrology has gradually been included in the quality control loop of industry; the possibility of fully inspecting complex-shaped objects without damaging them, and without the sensor accessibility restrictions that affect tiny or hidden features, is the major differential of the technology. CT relies on the attenuation of X-rays as they propagate through the test object. For different beam directions, the intensity distribution of the remaining radiation is detected and stored as a grey-value image. The planar images acquired over a complete rotation of the test object are mathematically processed to reconstruct the voxel dataset. Subsequent segmentation of the voxel dataset allows the edges of the test object to be distinguished; finally, dimensional analyses can be performed [1-2].

The typical design of a metrological CT scanner comprises an X-ray cone-beam source at one end and a flat panel detector at the other. Between them sits a turntable for stepwise rotation of the object, which can be moved linearly along the magnification axis. Figure 1 illustrates the design of a typical metrological CT system. Because of the complex CT measurement chain, many quantities and factors may influence the results of CT-based measurements. They may be related to the X-ray source (e.g. photon energy, focal spot size), the detector (e.g. sensitivity, pixel size, exposure time), the test object (e.g. material, geometry), the kinematics (e.g. magnification axis and turntable accuracy), and the image and further mathematical processing (e.g. image filtering, artifact reduction, edge detection, tolerance analysis) [2].

Figure 1: Typical design of a metrological CT system and the influence quantities and factors related to each CT subsystem.
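The X-ray attenuation referred to above is commonly modelled by the Beer-Lambert law. A simplified monochromatic form is added here for orientation only (it is not stated in the paper, and real sources are polychromatic, which is precisely what gives rise to beam hardening):

\[
I = I_0 \exp\!\left(-\int_{L} \mu(s)\, \mathrm{d}s\right)
\]

where \(I_0\) is the incident intensity, \(\mu\) the linear attenuation coefficient along the ray path \(L\), and \(I\) the intensity recorded by a detector pixel; the reconstruction step inverts this relation over all rays and rotation angles to obtain the voxel grey values.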

This complicated measurement error behavior has made it difficult to estimate the measurement uncertainty and thus to trace CT measurement results back to the unit of length. In fact, recent works have addressed the difficulties in suitably reporting the uncertainty associated with CT-based measurement results [3], the use of the experimental method described in ISO 15530-3 to determine the uncertainty [4-9], the development of virtual CT simulation platforms to compute the uncertainty [10], and the effect of varying different CT influence quantities and factors on the measurement result [11-18].

1.1 Subject matter

In 1993, Brown [19] defined methods divergence as the realization that dissimilar measuring methods yield different measured results for the same feature. To make tolerance verification consistent and nearly independent of metrologist decisions, GD&T evaluation programs have been offered in recent years. They take point clouds from test object measurements performed on any measurement device, merge the points with the corresponding nominal model, and automatically evaluate the tolerance callouts in compliance with GD&T standards such as ASME Y14.5 and ISO 1101.

For CT metrology, the typical input to a standalone GD&T evaluation program is the point cloud, which is obtained from the computed surface through the segmentation process with sub-voxel accuracy. The extracted surface is then represented by a set of points (coordinates) which can be connected to each other to form a triangle mesh (STL). This important stage of a CT measurement procedure is explored in this paper from the points of view of tolerance verification and measurement uncertainty. Based on empirical master-part reference measurements, data reduction methods for rationally extracting CT point cloud information are presented. Their effect on measurements of dimensional and geometrical characteristics is experimentally tested and discussed.

2 Point-based compensation approach

Recent studies have shown the dominant effect of the measurement bias on the uncertainty of dimensional CT results [7-9]. Influencing factors such as image artifacts and image noise cause problems in threshold surface determination (edge detection), directly impacting the measurement bias and jeopardizing further tolerance compliance analyses. To overcome this uncertainty issue in dimensional CT measurements, a point-based compensation approach has been formulated. The conceptual basis of the approach is summarized in this section.

First, the point-based compensation approach relies on a measurement technology that is more accurate than CT and preferably traceable to the SI unit of length, such as a classical coordinate measuring machine (CMM), labeled here as R-CMS (Reference Coordinate Measuring System). This is an imperative assumption, but a reasonable one, considering that CMM outcomes are appropriate for most GD&T-based inspection processes. From the point density point of view, it implies that the CMM point density is sufficient to perform the tolerance assessment, so that no extra information is needed at other locations of a particular feature. In other words, the point-based compensation approach equalizes the point density of both techniques, with no risk to the overall tolerance assessment. Second, the point-based compensation approach assumes that the CT output, labeled as L-CT (Line CT), is repeatable, with a constant bias at each specific point location.
This means that when virtually the same measurement point is extracted from several CT datasets of the same part, measured under repeatability conditions, a considerably smaller point location variation is observed. Repeatability conditions in CT measurements means keeping constant, or within certain limits, most of the influence quantities and factors mentioned in Figure 1. Given the two basic assumptions just described, the point-based compensation approach can be applied to L-CT datasets of nearly identical parts, i.e. the production parts must exhibit sufficient dimensional stability and material homogeneity over time. The measurement routine can be understood as four macro operations, as follows:

(1) A typical part is selected and designated as the master part, which is calibrated on an upper-echelon R-CMS using a measurement script appropriate for checking the dimensional and geometrical content of the part. The upper-echelon laboratory may use extra techniques to reduce the uncertainty as needed (e.g. substitution measurement method, repeated measurements). The R-CMS dataset (i.e. the measured value of each sampling-point position used to define each feature) is then recorded in a master-part file.

(2) The L-CT scans the master part using a proper CT setting (e.g. tube voltage and current, filter plate, integration time, voxel size, part orientation) and further processing steps (e.g. surface determination algorithm, beam hardening reduction algorithm), which have to be kept fixed in order for the compensation vector to remain applicable. The complete measuring cycle has to be repeated at least five times in order to estimate the point variation (the mean value is used to calculate the correction value). The L-CT datasets are then recorded in line-part files.

(3) The point-compensation algorithm correlates the L-CT dataset with the R-CMS dataset, filters out the uncorrelated points, and calculates the difference between each L-CT mean point and the corresponding R-CMS point (see the sketch at the end of this section). The point compensation vector/file is produced, representing the L-CT task-specific point error; it contains all the actual unknown systematic effects of the L-CT relative to the R-CMS.

(4) The L-CT measures regular parts following the same script predefined in item (2). The resulting L-CT dataset is extracted as an STL (or text) file, reduced using the R-CMS dataset, and the point-compensation vector from item (3) is applied to each reduced point. Thereby, the systematic error of each filtered L-CT point is corrected, and the uncertainty of each L-CT point can be reduced to its own measurement repeatability plus the uncertainty associated with the calibrated points.
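Operations (3) and (4) can be summarised by the following minimal Python sketch. It is an illustrative assumption of how the pairing, filtering and correction might be implemented, not the authors' or any software vendor's actual code; only the spherical-envelope idea and the 1 mm radius are taken from section 4.1.

```python
# Illustrative sketch of the point-based compensation approach (macro operations 3 and 4).
# Names and structure are assumptions; the 1 mm spherical envelope follows section 4.1.
import numpy as np
from scipy.spatial import cKDTree

def build_compensation_vector(rcms_points, lct_scans, radius=1.0):
    """rcms_points: (N, 3) calibrated reference points (master-part file).
    lct_scans: list of (M_i, 3) CT point clouds of the master part (line-part files).
    Returns an (N, 3) compensation vector and a mask of reference points that
    found CT points inside their spherical envelope in every repeated scan."""
    comp = np.zeros_like(rcms_points, dtype=float)
    valid = np.ones(len(rcms_points), dtype=bool)
    for scan in lct_scans:
        tree = cKDTree(scan)
        neighbours = tree.query_ball_point(rcms_points, r=radius)
        for i, idx in enumerate(neighbours):
            if idx:
                # deviation of the local CT mean point from the reference point
                comp[i] += rcms_points[i] - scan[idx].mean(axis=0)
            else:
                valid[i] = False          # uncorrelated point, filtered out
    comp /= len(lct_scans)                # average over the repeated scans
    return comp, valid

def apply_compensation(lct_points, rcms_points, comp, valid, radius=1.0):
    """Reduce a regular-part CT point cloud to the reference point pattern and
    correct each reduced point with the compensation vector."""
    tree = cKDTree(lct_points)
    corrected = []
    for p, c, ok in zip(rcms_points, comp, valid):
        idx = tree.query_ball_point(p, r=radius)
        if ok and idx:
            corrected.append(lct_points[idx].mean(axis=0) + c)
    return np.asarray(corrected)
```

Averaging the per-scan deviations is equivalent to taking the difference between the mean L-CT point and the corresponding R-CMS point, as described in operation (3).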

3 Experimental measurement design

To study the effect of the point-based compensation approach on GD&T tolerance verification, the exemplary prismatic part illustrated in Figure 2 was selected. The figure also shows the datums and tolerances assigned to the part geometries, which may profit from the proposed method. Being a proof-of-concept study, however, this paper covers only the flatness assigned to the primary datum A. Of particular relevance for CT measurement, the exemplary prismatic part is made of aluminum, with a nominal density of 2700 kg/m³ and a body diagonal of approximately 100 mm.

Figure 2: Illustrative image of the exemplary part indicating the assigned feature tolerances and the respective datums.

3.1 CMM reference measurements

The intrinsic characteristics of two exemplary prismatic parts were calibrated on a Carl Zeiss PRISMO ultra CMM, housed in a temperature-controlled laboratory kept at (20.0 ± 0.5) °C, which serves as the R-CMS in this study. For the primary datum A, scanning mode was used to sample the grid shown in Figure 3. The R-CMS point cloud was thus created and the single-point uncertainty estimated using the Virtual CMM software output and expert judgment. Data evaluation was then entirely executed in KOTEM SmartProfile 4.5 analysis software using the reference point cloud. For the flatness tolerance under scrutiny, the minimum zone method was chosen to fit the measured points to the nominal geometry, as defined in the GD&T standards. The reference value was found to be 0.0206 mm ± 0.0005 mm (k = 2) for part #1 and 0.0212 mm ± 0.0005 mm (k = 2) for part #2.

Figure 3: Point pattern created and sampled on the exemplary parts using the scanning measurement mode of the R-CMS.
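The minimum zone fit mentioned above was carried out in SmartProfile. As a self-contained illustration of the underlying idea, a simplified sketch (a direct search over the plane orientation, which approximates a rigorous minimum-zone algorithm and is not the software's actual implementation) could read:

```python
# Simplified minimum-zone flatness sketch: find the plane orientation that
# minimises the peak-to-valley distance of the sampled points.
import numpy as np
from scipy.optimize import minimize

def minimum_zone_flatness(points):
    """points: (N, 3) array of sampled surface points (e.g. the grid of Figure 3).
    Returns an estimate of the flatness value (zone width) in the input unit."""
    x, y, z = points.T

    def zone_width(ab):
        a, b = ab
        # residuals w.r.t. the plane z = a*x + b*y + c; the offset c cancels
        # out of the peak-to-valley range; divide to get perpendicular distances
        r = z - a * x - b * y
        return (r.max() - r.min()) / np.sqrt(1.0 + a * a + b * b)

    # start from the least-squares plane orientation
    A = np.column_stack([x, y, np.ones_like(x)])
    a0, b0, _ = np.linalg.lstsq(A, z, rcond=None)[0]
    res = minimize(zone_width, x0=[a0, b0], method="Nelder-Mead")
    return res.fun
```

The returned value estimates the width of the narrowest pair of parallel planes enclosing all sampled points, which is the flatness deviation in the minimum zone sense.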

3.2 CT measurements

The two exemplary parts were scanned separately on a Carl Zeiss METROTOM 1500 CT, the L-CT in this study, equipped with a 225 kV micro-focus tube and a 2048² pixel flat panel detector. The CT scanner is housed in a temperature-controlled room kept at (20 ± 1) °C. To scan the parts, the magnification axis was positioned so that each part was projected onto the largest possible area of the detector at 1024² pixel resolution (binning mode). The X-ray tube voltage was set high enough to avoid complete beam extinction, and the detector integration time was set to a convenient value. The source current was set to enhance image contrast and brightness. In this way, the spatial resolution remained limited by the focal spot size of 110 µm rather than by the resulting voxel size of 146 µm (the 400 µm pixel pitch divided by the image magnification of 2.74). The number of angular steps was selected as roughly twice the number of pixels covered by the shadow of the test part in the projection. For both parts, the L-CT built-in post-processing beam hardening reduction function was enabled. Table 1 summarizes the CT settings used for the L-CT scans (five repeated measurements on each part, performed under repeatability conditions), which were optimized for voxel dataset accuracy rather than for scanning time.

X-ray source: tube voltage 220 kV; tube current 500 µA; focal spot size 110 µm; prefilter 3.0 mm Cu
Flat panel detector: integration time 4000 ms (no image averaging); resolution mode 1024² pixel; pixel pitch 400 µm
Test sample setup: voxel size 146 µm; image magnification 2.74; number of projections 1500; relative geometrical unsharpness [20] 0.48
Table 1: CT control settings chosen for scanning both exemplary parts.

To determine the surface from the voxel dataset of the mono-material parts, the standard iso-50% threshold value was applied globally. The point cloud could then be created and exported with different default presets in VGStudio MAX 3.0 software. In this work only the super-precise default preset was considered in further analyses; in fact, no statistical difference has been evidenced in the form error estimates and the direction vectors of the datums when using different presets [21]. Five line-part files were therefore generated for each exemplary part. Data evaluation was again entirely executed in KOTEM SmartProfile 4.5 analysis software using each L-CT dataset. For the flatness tolerance under scrutiny, the mean value of the five measurements was 0.226 mm for part #1 and 0.234 mm for part #2.
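The global iso-50% threshold applied above is conventionally taken as the grey value halfway between the background peak and the material peak of the voxel histogram. A minimal sketch under that assumption is given below; it is illustrative only, and commercial packages additionally refine the surface to sub-voxel accuracy, as noted earlier.

```python
# Minimal sketch of a global iso-50% surface threshold: the grey value halfway
# between the background peak and the material peak of the voxel histogram.
import numpy as np

def iso50_threshold(volume, bins=1024):
    """volume: 3D array of reconstructed grey values (voxel dataset)."""
    hist, edges = np.histogram(volume.ravel(), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # crude two-peak search: background peak assumed in the lower half of the
    # grey-value range, material peak in the upper half (mono-material part)
    half = bins // 2
    background_peak = centers[np.argmax(hist[:half])]
    material_peak = centers[half + np.argmax(hist[half:])]
    return 0.5 * (background_peak + material_peak)
```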

4 Point-based compensation results

The previous section covered the first two macro operations of the point-based compensation approach described in section 2. Here, the implementation of the other two macro operations is presented and discussed from two perspectives: (a) the same part (#2) used for creating and applying the compensation vector; (b) a different part (#1) used for applying the compensation vector.

4.1 Biased analysis: same part used for creating and applying the compensation vector

First, the master-part file of part #2 has to be prepared (i.e. aligned to place the points in their proper positions using points sampled on other elements), to serve as the R-CMS dataset. Figure 4 illustrates the resulting R-CMS evaluation of the flatness tolerance. The actual surface profile can be clearly identified in the colour map on the left and in the deviation-to-nominal histogram on the right. The actual flatness deviation is the value reported in subsection 3.1 (also shown in the figure).

Figure 4: R-CMS flatness evaluation of part #2, showing the deviation colour map (left) and the deviation histogram (right).

The point-compensation database is then created using the prepared master-part file and the five line-part files associated with part #2. In this operation, the L-CT dataset is intelligently reduced to the points corresponding to the R-CMS dataset, and the compensation vector carrying the L-CT task-specific point error is created. To relate the L-CT dataset to the R-CMS dataset, a spherical envelope (of radius 1 mm in this study) is first created virtually around each R-CMS point; then the average of the L-CT points enclosed by the envelope is computed, and its deviation from the reference value can be estimated.

For illustration, the L-CT evaluation of the flatness tolerance of part #2 (first measurement) is displayed in Figure 5 using the filtered points only. The L-CT flatness value is about ten times greater than the R-CMS output, and the shape does not resemble that obtained with the R-CMS. This is because the actual shape of the part is hidden by the CT background noise. The behavior becomes even more evident in the deviation-to-nominal histogram, which shows a very different deviation distribution (the same colour scale is applied to all analyses).

Figure 5: L-CT flatness evaluation of part #2 (first measurement, filtered), displaying the deviation colour map (left) and the deviation histogram (right).

The point-compensation vector is then applied to each corresponding filtered L-CT point of part #2, producing the results displayed in Figure 6. The flatness value is now 0.030 mm for part #2 (first measurement), and the surface shape and the deviation histogram are comparable to the R-CMS results (same colour scale applied to all analyses). Because of the nature of the point compensation algorithm, the shape of the deviation histogram approaches a normal distribution. This is a consequence of the central limit theorem, which states that the sampling distribution of the mean of independent random variables is normal or nearly normal if the sample size is large enough [22]. Similar behavior was observed for all five L-CT measurements of part #2; the uncompensated and compensated flatness values are listed in Table 2.

Figure 6: Flatness evaluation of part #2 (first measurement, filtered) after applying the compensation vector to the same points sampled with the L-CT; deviation colour map displayed on the left and deviation histogram displayed on the right.

The significant reduction of the flatness values after individually compensating each L-CT point can be regarded as a very positive result of the proposed algorithm. The L-CT mean flatness value of part #2 after point compensation is (0.033 ± 0.020) mm; the uncertainty is determined similarly to the ISO 15530-3 method, i.e. as the root-sum-squared combination of the calibration uncertainty and the measurement procedure uncertainty, multiplied by the coverage factor k = 2. By far the most dominant uncertainty contribution is the CT background noise, which enters through the measurement procedure uncertainty. The result is more encouraging when the compensated L-CT flatness value is compared with the R-CMS flatness value. The normalized error (En value), which in this study expresses the difference between the two flatness values relative to the associated uncertainties, is used for this purpose [23].
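Neither quantity is written out explicitly in the text; a plausible explicit form, consistent with the description above and with the usual En definition, is:

\[
U = k\sqrt{u_{\text{cal}}^{2} + u_{p}^{2}}\,, \qquad
E_n = \frac{\lvert\, y_{\text{L-CT}} - y_{\text{R-CMS}} \,\rvert}{\sqrt{U_{\text{L-CT}}^{2} + U_{\text{R-CMS}}^{2}}}
\]

where \(u_{\text{cal}}\) is the standard uncertainty of the R-CMS calibration, \(u_{p}\) the standard uncertainty of the CT measurement procedure (dominated here by the background noise), \(k = 2\), and \(y\) and \(U\) denote the measured flatness values and their expanded uncertainties; \(\lvert E_n \rvert \le 1\) is conventionally taken as satisfactory agreement.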

An En value of 0.59 was calculated for this first scenario, which suggests satisfactory agreement between the two measurement results.

Measurement     L-CT uncompensated    L-CT compensated
Part #2 (r1)    0.233                 0.030
Part #2 (r2)    0.236                 0.033
Part #2 (r3)    0.232                 0.033
Part #2 (r4)    0.232                 0.034
Part #2 (r5)    0.236                 0.037
Table 2: L-CT uncompensated and compensated flatness values for part #2 (values in millimeters).

4.2 Unbiased analysis: different part used for applying the compensation vector

To emulate the fourth (and last) macro operation outlined in section 2 and to avoid the bias of the analysis described in subsection 4.1, the same compensation vector created previously was applied to the L-CT dataset of part #1. Despite the very similar flatness values found for the two parts using the R-CMS (see subsection 3.1 for the corresponding values), the actual surface shapes display significant differences, particularly close to the corners, as shown in Figure 7.

Figure 7: R-CMS flatness evaluation of part #1, showing the deviation colour map (left) and the deviation histogram (right).

The L-CT evaluation of the flatness tolerance of part #1 (first measurement) is shown in Figure 8 using only the filtered points. Again the L-CT flatness value is about ten times greater than the R-CMS output, and the shape does not resemble that obtained with the R-CMS. On the other hand, the shape is nearly the same as that observed for the L-CT measurement of part #2 (Figure 5), which confirms the previous impression that the actual shape is masked by the CT dataset noise. The deviation histogram also presents a similar shape (same colour scale applied to all analyses).

Figure 8: L-CT flatness evaluation of part #1 (first measurement, filtered), displaying the deviation colour map (left) and the deviation histogram (right).

Finally, the point-compensation vector can be applied to each corresponding filtered L-CT point of part #1, producing the results displayed in Figure 9. The flatness value is now 0.047 mm for part #1 (first measurement); however, the surface shape and the deviation histogram are not as close to the R-CMS results as in subsection 4.1.

The L-CT mean flatness value of part #1 after point compensation is (0.050 ± 0.020) mm (k = 2); the dominant uncertainty factor is again, by far, the CT background noise. When the compensated L-CT flatness value (and its associated measurement uncertainty) is compared with the corresponding R-CMS flatness value (and its associated measurement uncertainty), En = 1.44 is obtained. This implies poor agreement between the two measurement results, a direct consequence of the noisy CT dataset, which suppresses the actual shape of the surface. In other words, the surface signature (actual shape variation) is small compared to the CT measurement uncertainty, and the L-CT is thus unable to resolve it. The unsatisfactory agreement observed in this unbiased analysis is therefore explained by the actual shape variation being lower than the intrinsic method uncertainty. In this particular situation, if significant shape changes due to manufacturing were expected, a new R-CMS dataset (and a new master-part file) would be necessary to generate the compensation vector. However, before confirming this latter assumption, further noise reduction strategies would need to be implemented at different stages of the measurement process (e.g. when setting the CT parameters, and when applying enhancement algorithms at projection level and at voxel dataset level) in order to improve the outcome of the point-based compensation.

Figure 9: Flatness evaluation of part #1 (first measurement, filtered) after applying the compensation vector to the same points sampled with the L-CT; deviation colour map displayed on the left and deviation histogram displayed on the right.

5 Concluding remarks

The recently issued strategy document of the Consultative Committee for Length (CCL) identifies issues that must be addressed by the metrology community. In particular for coordinate metrology, digital manufacturing is driving demand for vastly higher point coordinate data density, and the interpretation of these massive datasets presents new challenges [24]. Intelligent filtering of the large number of data points generated by CT systems, and the development of correction procedures that reduce the measurement deviations of less accurate but volumetric sensors using measurement data from more accurate sensors, have been recognized as demands of coordinate metrology [25]. The point-based compensation approach described in this paper is therefore in full consonance with the current needs of coordinate metrology.

By compensating the CT point dataset, the task-specific measurement uncertainty can be estimated and traceability to the SI unit of length guaranteed. In fact, the experimental tests showed a well-defined bias for the CT datasets, so point compensation is worthwhile, but also a random variation that cannot be neglected in the uncertainty budget. CT point compensation is additionally justified by the distorted shape delivered by the technology due to image artefacts, which may jeopardize product tolerance analysis and manufacturing investigation. The experimental findings showed good agreement between the R-CMS dataset and the L-CT dataset after judicious point reduction and correction in the biased analysis, i.e. with the same part used for creating and applying the compensation vector, which emulates the situation in which the actual feature shape remains nearly constant over time.
This is a prerequisite that could be added to the conditions listed in section 2 whenever the actual feature variation is small compared to the measurement method uncertainty. Indeed, for the unbiased analysis, i.e. with a different part used for applying the compensation vector, unsatisfactory agreement between the R-CMS dataset and the L-CT dataset after point reduction and compensation was observed, owing to significant differences between the actual surface shapes of the two exemplary parts, which could not be resolved by the L-CT. The poor agreement observed in this unbiased analysis is explained by the actual shape variation, which is significantly smaller than the intrinsic method uncertainty. In this very particular case, if significant changes due to manufacturing were expected, a new R-CMS dataset would be necessary to create the compensation vector. This matter, however, has to be investigated further in several respects, such as: (a) applying enhancement algorithms at projection and voxel levels, (b) reducing the structural resolution by changing CT tube and detector parameters (which may limit the size and material of the part to be scanned), and even (c) enlarging the part form error to a level compatible with the CT capability of capturing the surface signature. Last but not least, despite the issues revealed by the unbiased analysis, the point-based compensation approach described in this paper definitely offers the potential to reduce the measurement uncertainty down to the level of the CT measurement process repeatability, for all features on a part of arbitrary complexity, measured by any operator-selected method (i.e. considering the influence quantities and factors shown in Figure 1).

Moreover, the full potential of the measurement data can be realized with a significant reduction of computational effort and an increase of measurement accuracy.

References

[1] W Dewulf, K Kiekens, Y Tan, F Welkenhuyzen and J-P Kruth, Uncertainty determination and quantification for dimensional measurements with industrial computed tomography, CIRP Annals - Manufacturing Technology 62, 2013
[2] J-P Kruth, A Weckenmann, S Carmignato, R Schmitt, M Bartscher and L De Chiffre, Computed tomography for dimensional metrology, CIRP Annals 60(2), 2011
[3] S Carmignato, Accuracy of industrial computed tomography measurements: Experimental results from an international comparison, CIRP Annals - Manufacturing Technology 61, 2012
[4] M Bartscher, M Neukamm, U Hilpert, U Neuschaefer-Rube, F Härtig, K Kniel, K Ehrig, A Staude and J Goebbels, Achieving traceability of industrial computed tomography, Key Engineering Materials, Vol. 437, 2010
[5] R Schmitt and C Niggermann, Uncertainty in measurement for X-ray computed tomography using calibrated work pieces, Measurement Science and Technology, Vol. 21, 2010
[6] P Müller, J Hiller, A Cantatore and L De Chiffre, Investigation of measuring strategies in computed tomography, New Technologies in Manufacturing, ISBN 978-8021442672, 2011
[7] P Müller, J Hiller, Y Dai, J L Andreasen, H N Hansen and L De Chiffre, Estimation of measurement uncertainties in X-ray computed tomography metrology using the substitution method, CIRP Journal of Manufacturing Science and Technology, Vol. 7, pp. 222-232, 2014
[8] C R Baldo, T L Fernandes and G D Donatelli, Experimental evaluation of the uncertainty associated with the result of feature-of-size measurements through computed tomography, Journal of Physics: Conference Series 733 (2016) 012056
[9] C R Baldo, T L Fernandes and G D Donatelli, Exploratory analysis of voxel size effects on CT measurements of situation characteristics of type point-to-point distance, Journal of Physics: Conference Series 733 (2016) 012057
[10] J Hiller and L M Reindl, A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography, Measurement, Vol. 45, 2012
[11] P Wenig and S Kasperl, Examination of the measurement uncertainty on dimensional measurements by X-ray computed tomography, European Conference on Non-Destructive Testing (ECNDT), Berlin, Germany (Sep. 2006)
[12] A Weckenmann and P Krämer, Assessment of measurement uncertainty caused in the preparation of measurements using computed tomography, IMEKO World Congress, Lisbon, Portugal (Sep. 2009)
[13] K Kiekens, F Welkenhuyzen, Y Tan, P Bleys, A Voet, W Dewulf and J-P Kruth, A test object for calibration and accuracy assessment in X-ray CT metrology, International Symposium on Measurement and Quality Control (ISMQC), Osaka, Japan (Sep. 2010)
[14] W Dewulf, Y Tan and K Kiekens, Sense and non-sense of beam hardening correction in CT metrology, CIRP Annals - Manufacturing Technology 61, 2012
[15] L Franco, J A Yagüe-Fabra, R Jiménez, M Maestro and S Ontiveros, Error sources analysis of computed tomography for dimensional metrology: an experimental approach, European Conference on Non-Destructive Testing (ECNDT), Prague, Czech Republic (Oct. 2014)
[16] J Schlecht, E Ferley, S Coughlin, S D Phillips, V D Lee and C M Shakarji, Evaluating CT for metrology: The influence of material thickness on measurements, European Conference on Non-Destructive Testing (ECNDT), Prague, Czech Republic (Oct. 2014)
[17] C R Baldo, T L Fernandes and G D Donatelli, Experimental investigation of computed tomography dimensional capability using modular test artefacts, IMEKO World Congress, Prague, Czech Republic (Sep. 2015)
[18] C R Baldo, T L Fernandes and G D Donatelli, Experimental assessment of computed tomography dimensional performance using modular test pieces made of polyoxymethylene and aluminum, Conference on Industrial Computed Tomography, Wels, Austria (Feb. 2016)
[19] C W Brown, Feature-based tolerancing for intelligent inspection process definition, Forum on Dimensional Tolerancing and Metrology, Dearborn, MI, USA, 1993
[20] F A Arenhart, C R Baldo, T L Fernandes and G D Donatelli, Experimental investigation of the influence factors on the structural resolution for dimensional measurements with CT systems, Conference on Industrial Computed Tomography, Wels, Austria (Feb. 2016)
[21] C R Baldo, T L Fernandes and G D Donatelli, Computed tomography dataset handling for geometrical tolerance analysis of prismatic test pieces, International Congress of Mechanical Metrology, Fortaleza, Brazil (Nov. 2017)
[22] D C Montgomery and G C Runger, Applied statistics and probability for engineers, Wiley, New York, NY, USA, 832 p, 2013
[23] H N Hansen and L De Chiffre, Nordic audit of coordinate measuring machines, EUROMET Project No 330, Nordic Industrial Fund Project No P93046, Final Report, 1996
[24] Consultative Committee for Length (CCL): Current status and strategy (2016), 9 pp. Retrieved October 22, 2017, from https://www.bipm.org/en/committees/cc/ccl/publications-cc.html
[25] Final Publishable JRP Summary for IND59 Microparts: Multi-sensor metrology for microparts in innovative industrial products (2016), 6 pp. Retrieved October 22, 2017, from https://www.ptb.de/emrp/microparts.html