Effective and Efficient Use of the Fundamentals of Engineering Exam for Outcomes Assessment

Goals/ABET Criteria Addressed

The goals of this presentation are to: 1) make participants aware of the changes that have been incorporated in the new FE Examination; 2) explain how to obtain FE exam results in a format that is useful for outcomes assessment; 3) demonstrate methods for interpreting the sub-tests of the FE Exam that are important to specific engineering programs; 4) suggest methods for making efficient use of the FE exam for outcomes assessment; and 5) illustrate how the use of the results for program improvement can be documented.

In the ABET Engineering Criteria 2000, engineering program outcomes and assessment are the subject of Criterion 3. The assessment measures are to be used in determining how successful the program is in achieving the outcomes important to the educational goals and mission of the institution. In part, the criterion states that possible outcomes assessment measures may include nationally normed subject content examinations. One examination that deals with the subject matter of engineering programs is the Fundamentals of Engineering Examination offered by the National Council of Examiners for Engineering and Surveying (NCEES).

Presentation Format

This session consists of a presentation in an informal lecture style. Since specific examples of the examination format and the interpretation of results by subject content are included, interaction with attendees regarding each part of the examination and its results is encouraged. Participants consider examination subjects in light of the emphasis of their own curricula and actively participate in setting the priorities they should establish for their own programs in interpreting FE exam results. Examples of graphical methods for summarizing important FE exam results demonstrate how several institutions use these results for effective and efficient outcomes assessment.
Use of the results for documentation of program improvements is also illustrated.

Session Summary

The NCEES Fundamentals of Engineering Examination (FE Exam) has been administered to candidates for professional engineering licensure for many years. The ABET Engineering Criteria 2000 includes Criterion 3, Program Outcomes and Assessment, which reads, in part: "The assessment process must demonstrate that the outcomes important to the mission of the institution and the objectives of the program, including those listed above, are being measured." It is clear that ABET is calling for an assessment process, not just one method of measuring student achievement of learning objectives. But it is also clear that a nationally normed examination such as the FE Exam can make a valuable contribution to assessing outcomes for a program if the examination is properly constructed and its results are properly applied. Until recently, the FE Exam tested knowledge of engineering subjects that were common to all disciplines and found primarily in the first two years of engineering study. The revised FE Exam differs in both content and format from the exam administered before October 1996, and while the candidate pass rate on the FE examination is useful to state boards of registration, the
examination reports provide a rich source of information that is available to individual deans and department chairs if they choose to use it to evaluate and improve their programs.

The FE Exam can be useful for outcomes assessment to the extent that its content has construct validity. In this regard, the presentation provides background on how NCEES arrived at the specifications for the new examination. A task force with membership from NCEES, ASEE, ABET, the Engineering Deans Council (EDC), and NSPE was charged with developing the specifications for an examination that could be more useful for assessment than the old FE Exam. The sponsoring organizations recognized that such an examination would constitute only one part of a many-faceted assessment process; the focus of the task force was on developing the knowledge-based specifications for the examination component of that process. Those specifications were subsequently adopted by NCEES and are the basis for the new FE Exam, first administered in October 1996. The exam specifications for both the lower-division morning session and the various discipline-specific upper-division modules are presented and discussed.

The FE Exam is administered to tens of thousands of engineering seniors and recent graduates each year. The results provide both statistically significant comparisons and a rich source of data from which deans and department chairs can learn how their own students perform, relative to national norms, in specific engineering content areas. For faculty working on assessment plans, it is important to understand the kinds of reports available from NCEES at no cost to the institution and how to use the resulting data efficiently.
The detailed breakdown of the performance of groups of students (such as those enrolled in Electrical Engineering) in specific subject areas (such as digital system design) makes the new FE Exam a powerful and useful assessment tool to be used along with other outcomes measures. This presentation features specific examples from both lower-division and upper-division modules in several disciplines.

Key Words

FE Examination, Outcomes Assessment, Licensing Exams, Engineering Exams, EIT

Bibliography

Fundamentals of Engineering Sample Questions, National Council of Examiners for Engineering and Surveying, Clemson, SC, 1996.

Fundamentals of Engineering Discipline Specific Reference Handbook, National Council of Examiners for Engineering and Surveying, Clemson, SC, 1996.

Session Presenters

Dr. John W. Steadman, P.E.
Dean of Engineering, University of South Alabama
307 University Blvd, EGCB 108, Mobile, AL 36688
Ph: 251-460-6140, Email: jsteadman@usouthal.edu

Dr. David L. Whitman, P.E.
Associate Dean of Engineering, University of Wyoming
Dept 3295, 1000 E. University Ave., Laramie, WY 82071
Ph: 307-766-4253, Email: whitman@uwyo.edu

Dr. Walter LeFevre, P.E.
University Professor Emeritus, University of Arkansas
#4190 BELL, CVEG, UAF, Fayetteville, AR 72701
Ph: 479-575-6024, Email: ewl@engr.uark.edu
FUNDAMENTALS OF ENGINEERING (FE) EXAMINATION

Content Specifications

Beginning with the October 1996 examination, the Fundamentals of Engineering Examination offered, in the afternoon session, examinations in five disciplines plus one general, non-discipline-specific examination. The morning session is common to all disciplines and consists of 120 questions that count 1 point each, for a total of 120 points. Listed below are the topics the examination covers and the percentage of questions devoted to each.

MORNING SESSION SUBJECTS AND PERCENTAGE OF QUESTIONS

Chemistry                             9%
Computers                             5%
Dynamics                              8%
Electrical Circuits                  10%
Engineering Economics                 4%
Ethics                                4%
Fluid Mechanics                       7%
Materials Sci./Structure of Matter    7%
Mathematics                          20%
Mechanics of Materials                7%
Statics                              10%
Thermodynamics                        9%

AFTERNOON SESSION SPECIFICATIONS (EFFECTIVE OCTOBER 2002)

The afternoon session consists of 60 questions that count 2 points each, for a total of 120 points. Listed below are the topics each examination covers and the percentage of questions devoted to each.

CHEMICAL
Chemical Reaction Engineering    10%
Chemical Thermodynamics          10%
Computer & Numerical Methods      5%
Heat Transfer                    10%
Mass Transfer                    10%
Material/Energy Balances         15%
Pollution Prevention              5%
Process Control                   5%
Process Design & Economics       10%
Process Equipment Design          5%
Process Safety                    5%
Transport Phenomena              10%

CIVIL
Computers & Numerical Methods     10%
Construction Management            5%
Environmental Engineering         10%
Hydraulics & Hydrologic Systems   10%
Legal & Professional Aspects       5%
Soil Mechanics & Foundations      10%
Structural Analysis               10%
Structural Design                 10%
Surveying                         10%
Transportation Facilities         10%
Water Purification & Treatment    10%
ELECTRICAL
Analog Electronic Circuits           10%
Communications Theory                10%
Computer & Numerical Methods          5%
Computer Hardware Engineering         5%
Computer Software Engineering         5%
Control Systems Theory & Analysis    10%
Digital Systems                      10%
Electromagnetic Theory & Appl.       10%
Instrumentation                       5%
Network Analysis                     10%
Power Systems                         5%
Signal Processing                     5%
Solid State Electronics & Devices    10%

MECHANICAL
Automatic Controls                  5%
Computer                            5%
Dynamic Systems                    10%
Energy Conversion & Power Plants    5%
Fans, Pumps, & Compressors          5%
Fluid Mechanics                    10%
Heat Transfer                      10%
Material Behavior/Processing        5%
Measurement & Instrumentation      10%
Mechanical Design                  10%
Refrigeration & HVAC                5%
Stress Analysis                    10%
Thermodynamics                     10%

INDUSTRIAL
Computer Computations & Modeling     5%
Design of Industrial Experiments     5%
Engineering Economics                5%
Engineering Statistics               5%
Facility Design & Location           5%
Industrial Cost Analysis             5%
Industrial Ergonomics                5%
Industrial Management                5%
Information System Design            5%
Manufacturing Processes              5%
Manufacturing Systems Design         5%
Material Handling System Design      5%
Math Optimization & Modeling         5%
Production Planning & Scheduling     5%
Productivity Measurement & Mngmnt    5%
Queuing Theory & Modeling            5%
Simulation                           5%
Statistical Quality Control          5%
Total Quality Management             5%
Work Performance & Methods           5%

GENERAL
Chemistry                                7.5%
Computers                                5%
Dynamics                                 7.5%
Electrical Circuits                     10%
Engineering Economics                    5%
Ethics                                   5%
Fluid Mechanics                          7.5%
Material Science/Structure of Matter     5%
Mathematics                             20%
Mechanics of Materials                   7.5%
Statics                                 10%
Thermodynamics                          10%

ENVIRONMENTAL
Water Resources                   25%
Water & Wastewater Engineering    30%
Air-Quality Engineering           15%
Solid- & Hazardous-Waste Engr     15%
Environmental Science & Mngt      15%
Sample of Report 5 that is available from NCEES

National Council of Examiners for Engineering and Surveying (NCEES)
Fundamentals of Engineering Examination
OCTOBER 2003 Administration
Report 5 - Subject Matter Report by Major Based on Particular PM Examination Selected
ABET-Accredited Engineering Program Examinees Currently Enrolled in School, Undergraduates

Board: XXXXXX                 Board Code: XX
Name of Institution:          Institution Code: 00
Major: CIVIL                  PM Examination Selected: CIVIL

                         This
                         Institution   State   Nat'l   Comparator Groupings
No. Examinees Taking          22         67     1688    1049    247    179
No. Examinees Passing         20         57     1369     882    205    132
% Examinees Passing          91%        85%      81%     84%    83%    74%

Column key: #Q = number of exam questions; Inst% = this institution's average percent
correct; Nat'l% = national average percent correct; SD = national standard deviation;
the last three columns are the average percent correct for the Carnegie Research/
Doctoral-Extensive, Research/Doctoral-Intensive, and Masters Comprehensive groups.

AM Subject (1 point each)   #Q   Inst%   Nat'l%   SD    R/D-Ext   R/D-Int   Masters
CHEMISTRY                   11    59      61      2.1     63        62        59
COMPUTERS                    7    47      53      1.4     55        49        53
DYNAMICS                     9    62      64      1.8     66        66        59
ELECTRICAL CIRCUITS         12    44      41      2.1     41        45        39
ENGINEERING ECON             5    74      72      1.1     72        71        70
ETHICS                       5    65      66      1.0     66        65        67
FLUID MECHANICS              8    56      56      1.6     58        58        51
MAT SCI/STR MATTER           8    64      61      1.5     62        62        56
MATHEMATICS                 24    54      59      3.8     61        59        57
MECH OF MATERIALS            8    56      63      1.5     65        64        62
STATICS                     12    58      61      2.2     63        64        58
THERMODYNAMICS              11    50      45      2.1     46        48        40

PM Subject (2 points each)  #Q   Inst%   Nat'l%   SD    R/D-Ext   R/D-Int   Masters
CONSTRUCTION MGMNT           3    55      43      0.9     43        43        41
COMP & NUM METHODS           6    50      53      1.2     55        54        51
ENVIRONMENTAL ENGR           6    62      55      1.1     56        58        52
HYDRAUL/HYDROLOG SYS         6    58      59      1.4     59        63        54
LEGAL & PROF ASPECTS         3    83      75      0.8     75        74        74
STRUCTURAL ANALYSIS          6    61      64      1.0     64        66        62
STRUCTURAL DESIGN            6    45      48      1.3     47        52        46
SOIL MECH & FOUNDAT          6    56      56      1.4     58        58        53
SURVEYING                    6    51      45      1.2     45        47        47
TRANS FACILITIES             6    56      50      1.3     50        52        49
WATER PUR & TREAT            6    51      51      1.5     50        53        49
While the evaluation of a specific exam administration can lead to short-term conclusions, assessment is better performed using longitudinal data over several exam offerings. The data are presented graphically in three different formats: 1) raw score; 2) ratio of raw score to the national score; and 3) a standardized score that will be defined later. The three sets of graphs shown below can be generated for any combination of major, PM exam taken, and subject matter. For formats 2) and 3), the graphs present the data along with a pre-determined goal to be achieved.

Format 1) Comparison of Institutional Raw Scores to National Raw Scores (% correct)

Format 2) Ratio of Institutional Score to National Score along with pre-determined goal
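Format 2 can be sketched in a few lines of Python. The administration dates, percent-correct scores, and the 0.95 goal below are invented for illustration only; they are not the institutions' actual data, and an institution would substitute its own Report 5 values and its own pre-determined goal:

```python
# Sketch of Format 2: ratio of institutional to national percent-correct
# over several exam administrations, compared against a preset goal.
# All numbers below are hypothetical, not real FE exam data.

GOAL = 0.95  # hypothetical goal: score at least 95% of the national average

administrations = ["Oct 2001", "Apr 2002", "Oct 2002", "Apr 2003", "Oct 2003"]
institutional = [52, 55, 50, 57, 54]   # institution's percent correct in one subject
national      = [58, 57, 56, 58, 59]   # national percent correct, same subject

for when, inst, natl in zip(administrations, institutional, national):
    ratio = inst / natl
    flag = "meets goal" if ratio >= GOAL else "below goal"
    print(f"{when}: ratio = {ratio:.2f} ({flag})")
```

Plotting these ratios against the goal line, administration by administration, produces the Format 2 graph described above.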
Format 3) Standardized Score, Range, and pre-determined goal

The standardized score allows the institution to see quickly how many standard deviations its students are above or below the national score. The range on the standardized score allows for a degree of uncertainty that accounts for the small number of takers at any particular institution.

Definitions:

    Standardized Score = (Institutional Score - National Score) / National Standard Deviation

    Range on Standardized Score = ±1 / √(Number of Institutional Takers)
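The two definitions above can be applied directly to a row of Report 5. As a worked example (a minimal sketch; the function names are ours, not NCEES's), take the MATHEMATICS row of the sample report: institution average 54% correct, national average 59%, national standard deviation 3.8, and 22 institutional takers:

```python
import math

def standardized_score(inst_score, natl_score, natl_std_dev):
    """Number of national standard deviations the institution's
    average is above (+) or below (-) the national average."""
    return (inst_score - natl_score) / natl_std_dev

def score_range(n_takers):
    """Half-width of the uncertainty band on the standardized score,
    accounting for the small number of takers at one institution."""
    return 1.0 / math.sqrt(n_takers)

# MATHEMATICS row from the sample Report 5: 54% vs. 59%, SD = 3.8, n = 22
z = standardized_score(54, 59, 3.8)
half_width = score_range(22)
print(f"standardized score = {z:.2f}, range = +/-{half_width:.2f}")
# prints: standardized score = -1.32, range = +/-0.21
```

So this institution's mathematics performance sits about 1.3 national standard deviations below the national average, with an uncertainty band of roughly ±0.2; the Format 3 graph plots this score and band against the program's pre-determined goal.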