Materials and Testing Audit
Department-wide Report

Introduction

This report has been prepared for the Transportation Commission, TxDOT Administration, and management. It presents the results of the Construction Division's Materials and Testing Audit, which was part of the Fiscal Year 2010 Audit Plan. The objectives of the audit were to determine whether:

- Materials are from approved sources
- Materials are sampled and tested in compliance with project requirements
- Material samplers are certified for the tests performed
- Pay estimate approvals and processing are separated, and
- All ARRA special provisions are included and monitored in ARRA-funded contracts.

Background

The SiteManager Contract Administration User Manual states that TxDOT monitors construction contracts in SiteManager. Contract administration in SiteManager includes the following activities:

- Compliance with contract documents and specifications
- Enforcing state and federal regulations
- Ensuring quality control by overseeing, inspecting, and reviewing sampling and testing of all materials and work, and
- Keeping and maintaining good project records.

SiteManager was developed as part of a $10 million AASHTO joint development project sponsored by 20 transportation agencies. The system automates many of the administrative functions handled manually on construction projects.

Materials incorporated in TxDOT projects are subject to various quality assurance procedures such as testing, certification, quality monitoring, and approved lists. The document that outlines these methods and the minimum sampling and testing rates is called the Guide Schedule. The Guide Schedule is the basis for customizing each project's sampling and testing requirements, which are set up in SiteManager's Materials Management function.
In addressing the results of a State Auditor's Office (SAO) A-133 audit of TxDOT's compliance with federal requirements on construction contracts, David Casteel, P.E., Assistant Executive Director for Field and District Operations, issued a memorandum on February 17, 2010, stating the following:

Phase I - CST (Construction Division) will develop automated procedures to reduce manual test data input and simplify the input process. Currently, inspectors enter extensive data for all tests. Districts will have the option to enter only the test result. With this change, the certified tester who performed the test must be entered into the test record.

Report P20110 1 of 6 May 18, 2011
Phase I will be completed June 2010. At that time, SiteManager will become the official sampling and testing record, and all sampling and testing data must be recorded in SiteManager.

There is a Phase II to Mr. Casteel's action plan:

Phase II - SiteManager sampling and testing procedures will be reorganized to simplify and accurately reflect the Guide Schedule. This process will be simplified, aligned with actual construction operations, and modified to improve consistency with the Guide Schedule. Phase II will be completed by September 2011.

The current audit did not evaluate the implementation of Phase II because those activities were still in the planning stage as of November 2010.

Scope

Audit team members included Polita Flemming, Milan Hawkins, and Roberto Manzo. The audit work was conducted from December 2009 to March 1, 2011. Audit work was postponed from January to September 2010 while CST implemented changes in response to the SAO's A-133 audit results. All work was conducted in conformance with the International Standards for the Professional Practice of Internal Auditing of The Institute of Internal Auditors.

Audit work consisted of researching the following:

- FHWA regulations regarding project-produced materials
- The Guide Schedule, which outlines material quality assurance procedures
- Materials tested by the Central Lab (CST/M&P), placed on the Master Producers List, and eligible for release to multiple projects
- District lab operations, which include AQMP (Aggregate Quality Monitoring Program) and non-AQMP stockpile testing, as well as tests required by the project managers
- The Labinator and SiteManager tester lists by district
- Verification/acceptance testing by the project inspectors
- The various methods used to record sampling and test information, and
- The ARRA special provisions.

Audit work was conducted at the Abilene, Amarillo, Beaumont, El Paso, and Pharr districts.
Opinion

TxDOT is adequately managing the sampling and testing of materials used in the selected projects. In addition, no significant concerns were noted with the tracking of material sources, the certification of samplers, the segregation of estimate payment duties, or adherence to ARRA program requirements. However, some opportunities for improvement are noted in the Findings/Observations sections below.
Summary of Findings

1. Not all sampling test results (performed after June 1, 2010) are shown in SiteManager.
2. The test recording process misrepresents the responsibilities assigned to the individual(s) who perform the sampled-by, tested-by, reviewed-by, and/or authorized-by functions.
3. Sample test deficiencies are not resolved in an expedient manner.

Detailed Findings and Recommendations

Finding No. 1: Not all sampling test results (performed after June 1, 2010) are shown in SiteManager.

Forty-four point six percent (44.6%, 37 of 83) of the quality assurance tests performed by the districts did not show test results in the test templates examined in SiteManager. Inspectors were under the misapprehension that copying quality assurance test results for material approved by other districts would automatically populate the test results onto their SiteManager templates. In other instances, for material approved by CST/M&P, the links to the supporting test results were unworkable or difficult to navigate. One district's policy is to keep the results of assurance tests performed by the district lab in a log and not load the information into SiteManager.

Criteria: On June 1, 2010, the SiteManager database became TxDOT's official sampling and testing record. Districts are responsible for ensuring that minimum material testing requirements are met for each project and are required to enter the test results in the SiteManager database.

Effect: The lack of information in SiteManager, or the inability to locate test results effectively, impedes determining whether material testing frequencies met reporting requirements and whether substandard materials were kept out of use.

Recommendation: Because a sample created with the Copy Sample option contains all data from the original sample except the test data, the Master Sample ID should be clearly indicated whenever that option is used.
As instructed by Mr. Casteel's memo, and a follow-up memo issued on December 10, 2010, by Russel W. Lenz, P.E., Construction Division (CST) Director, all test results should be recorded in SiteManager.

Management Response and Action Plan:
Responsible Party: Russel Lenz
Implementation Date: Dec 2011

CST-C will address the Copy Sample option to bring test data into the sample rather than a blank template. In the case of Quality Monitoring (QM) aggregate samples, the results are a rating for the producer generated by applying a 99% confidence limit to the latest five QM samples from the producer. As such, the QM record carries ratings rather than specific test data for a sample. We will add a note on these entries that the samples used to determine the statistical rating are found in the Laboratory Information Management System (LIMS) and provide a link to the LIMS, if possible. CST-C will randomly review project sampling and testing to ensure that the results of assurance tests performed by the district labs are loaded into SiteManager (reference Finding 3).

Finding No. 2: The test recording process misrepresents the responsibilities assigned to the individual(s) who perform the sampled-by, tested-by, reviewed-by, and/or authorized-by functions.

Personnel in non-inspection and/or non-testing activities who simply source the material or update unspecified materials are listed as the test reviewer/authorizer. On the test template, a comment bubble in the reviewer field asserts, "This field is populated when a user records the test's stamp code." Also, twenty-two point nine percent (22.9%, 19 of 83) of the test templates examined showed that the same individual was the sampler, reviewer, and authorizer.

As explained by CST management, the off-site (Quality Monitoring Program) materials have already undergone sampling, testing, and acceptance by qualified personnel, which attests to their meeting the specifications. The individual(s) sourcing or updating the material are making the correct selections from a list of pre-approved QM samples (i.e., approved aggregate sources on the AQMP, pretested asphalt sources, etc.) that have been confirmed to meet the specific requirements (i.e., surface aggregate classification, grade of asphalt, etc.) of the contract. Due to a lack of resources, some district policies allow each person to authorize their own samples, while other offices prefer to have one designated person review and authorize all samples.
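The QM producer rating mentioned above applies a 99% confidence limit to a producer's latest five QM samples. A minimal sketch of that calculation is shown below; the function name, the use of an upper one-sided limit, and the example values are illustrative assumptions, not TxDOT's actual procedure.

```python
# Hypothetical sketch of a QM producer rating: a 99% confidence limit
# applied to the latest five QM test results from a producer.
# Names and example values are illustrative, not TxDOT's.
from math import sqrt
from statistics import mean, stdev

T_CRIT_99_DF4 = 4.604  # two-sided 99% Student's t value, 4 degrees of freedom

def qm_rating(latest_five):
    """Upper 99% confidence limit on the producer's latest 5 results."""
    if len(latest_five) != 5:
        raise ValueError("rating uses exactly the latest 5 QM samples")
    m = mean(latest_five)          # sample mean
    s = stdev(latest_five)         # sample standard deviation (n-1)
    return m + T_CRIT_99_DF4 * s / sqrt(5)

# Example: five wear-loss results for a hypothetical aggregate producer
print(round(qm_rating([22.1, 23.4, 21.8, 22.9, 23.0]), 2))  # prints 24.01
```

Because the rating summarizes five underlying samples statistically, the individual results exist only in LIMS, which is why the response above proposes linking the SiteManager entry back to LIMS rather than duplicating test data.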
Criteria: Per the SiteManager Materials Management Manual, Chapter 8, "Sampling and Testing Guidelines": Prior to authorization, the authorizer (the individual accepting technical responsibility for the use of the material who has determined that the material meets the requirements for the intended use) should verify the following in SiteManager:

- The test result meets the specification requirement for the project
- The sampler is qualified to sample the material
- The sample is associated with a contract(s) and its line items; if not, contact the district for clarification prior to authorization
- The actual completion date has been entered
- The Status, Sample Type, and Acceptance Method are completed as outlined in the Stamp Codes section.

Effect: Unless non-inspection and/or non-testing personnel confirm that the test results meet the specification requirements of the contract, the current practice is not a capable control. When an inspector is listed as the sampler, reviewer, and authorizer, questions arise as to the substance of the process.

Recommendation: For clarity and certainty of responsibilities, the procedures manual (SiteManager Materials Management) should address when the area engineer is delegating their authority to operational staff and the scope of that assignment. Another option could be to enter in the remarks field (on the test template) an explanation of an individual's actual participation (relative to the situations outlined above). Lastly, while the sampler, tester, and authorizer functions have a defined connotation in the SiteManager Materials Management Manual, the reviewer function is indistinct. A description of a reviewer's tasks within the context of sampling and testing activities is necessary.

Management Response and Action Plan:
Responsible Party: Russel Lenz
Implementation Date: Dec 2011

CST-C will modify the current procedures to include, in the remarks, the role the individual is taking when copying QM samples. With the next update of the Materials Management Manual, the reviewer function will be defined. SiteManager will be modified so that the approval of samples requires at least two separate individuals.

Finding No. 3: Sample test deficiencies are not resolved in an expedient manner.

Deficiencies arise when tests have not been performed, tests are incorrectly assigned, test results have not been completed, or Quality Monitoring (QM) results have not been certified. A three-month tracking of the deficiencies associated with the projects selected for our sample (drawn from the Xite Reports) indicated that twenty-four point two percent (24.2%, 121 of 500) of the deficiencies under the districts' and CST/M&P's purview had not been resolved. Many deficiencies were related to the one (1) test-per-project requirement for the material item. In order not to hold up the monthly progress estimates, the practice is to turn off the Deficient Sample Indicator, which notifies project management that sample testing deficiencies exist when an estimate is created. Another practice, as voiced by district personnel, was to resolve any remaining deficiencies during a project's Final Review.

Criteria: Deficiencies are generated when insufficient samples exist to meet the Contract Sampling and Testing Requirements based on installed quantities.
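The rule above can be sketched as a simple check: the required number of tests grows with the installed quantity and the contract's minimum sampling rate, and a deficiency exists when fewer tests are on record. The function names, quantities, and rate in this sketch are hypothetical illustrations, not values from the Guide Schedule.

```python
# Illustrative sketch of how a sampling deficiency arises: required tests
# are driven by installed quantity and a minimum sampling rate.
# All names and values are hypothetical, not actual Guide Schedule figures.
from math import ceil

def required_tests(installed_qty, qty_per_test):
    """Minimum number of tests for the quantity installed so far."""
    return ceil(installed_qty / qty_per_test)

def is_deficient(installed_qty, qty_per_test, tests_recorded):
    """True when fewer tests are on record than the quantity requires."""
    return tests_recorded < required_tests(installed_qty, qty_per_test)

# 9,500 tons placed, one test required per 3,000 tons, 3 tests on record:
# 4 tests are required, so a deficiency is flagged.
print(is_deficient(9_500, 3_000, 3))  # prints True
```

Under a rule like this, a deficiency can appear on any estimate cycle as quantities grow, which is why turning off the Deficient Sample Indicator or deferring resolution to Final Review weakens the control.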
Effect: Not resolving deficiencies at the earliest point possible, particularly waiting until a project's final review, precludes timely assessment of the quality of the materials used.

Recommendation: Area office personnel should make it a practice to resolve deficiencies during the preparation of the monthly progress estimates. The district labs should coordinate with the Central Lab (CST/M&P) on a more effective way to address QM-related deficiencies.

Management Response and Action Plan:
Responsible Party: Russel Lenz
Implementation Date: July 2011

CST will reinforce to the districts the importance of timely resolution of testing deficiencies. CST-C will add a review of sampling and testing requirements to our routine Change Order (CO) review process. The contracts selected for the monthly 10% CO review will also be included in the monthly review of sampling and testing deficiencies. The annual 100% CO review will also include a review of the discrepancy status of all contracts in the district.
Observations

The following additional observations are presented for consideration in improving the efficiency of operations and may be helpful when assessing changes to be made during the Phase II update:

- An inference of excessive testing by state forces: For base or sub-base work, the practice is not to record a failing test. Without discouraging TxDOT's monitoring of projects, the responsibility should rest on the contractors to perform their own testing and inform TxDOT when the work is ready for assurance testing.
- The identity of the inspector who created a test is open to question: The ability to select the inspector's name from a drop-down menu allows anyone to create a sample using someone else's name.
- Certification data differ depending on the database queried: The Labinator (an add-on created by CST that includes the most current certification information) and the SiteManager district tester databases show different information for the same testers.
- Administrative delays hinder updating access statuses: The regions are not providing the districts with the deactivation status of former employees.
- The effectiveness of statewide desk audits may be hampered: Districts are still proposing to develop internal SOPs to address sampling and testing problems.
- Source letters do not reflect added or deleted sources: Sourcing letter information is not updated when the contractor(s) make material source revisions.

Closing Comments

The results of this audit were discussed with the CST Director, the CST Section Director, and the CST/M&P Director on March 28, 2011. We thank all employees who assisted us during this audit.