Data validation and reconciliation - Wikipedia
Industrial process data validation and reconciliation, or more briefly, data validation and reconciliation (DVR), is a technology that uses process information and mathematical methods in order to automatically correct measurements in industrial processes. The use of DVR allows extracting accurate and reliable information about the state of industrial processes from raw measurement data and produces a single consistent set of data representing the most likely process operation.

Contents

1 Models, data and measurement errors
1.1 Error types
1.2 Necessity of removing measurement errors
2 History
3 Data reconciliation
3.1 Redundancy
3.2 Benefits
4 Data validation
4.1 Data filtering
4.2 Result validation
4.3 Gross error detection
5 Advanced data validation and reconciliation
5.1 Thermodynamic models
5.2 Gross error remediation
5.3 Workflow
6 Applications
7 See also
8 References

Models, data and measurement errors

Industrial processes, for example chemical or thermodynamic processes in chemical plants, refineries, oil or gas production sites, or power plants, are often represented by two fundamental means:

1. Models that express the general structure of the processes,
2. Data that reflect the state of the processes at a given point in time.

Models can have different levels of detail: for example, one can incorporate simple mass or compound conservation balances, or more advanced thermodynamic models including energy conservation laws. Mathematically, the model can be expressed by a nonlinear system of equations in the process variables, which incorporates all the above-mentioned system constraints (for example the mass or heat balances around a unit). A variable could be the temperature or the pressure at a certain place in the plant.

Error types
Random and systematic errors

(Figures: normally distributed measurements without bias; normally distributed measurements with bias.)

Data typically originates from measurements taken at different places throughout the industrial site, for example temperature, pressure and volumetric flow rate measurements. To understand the basic principles of DVR, it is important to first recognize that plant measurements are never 100% correct, i.e. the raw measurements are not a solution of the nonlinear system of constraints. When uncorrected measurements are used to generate plant balances, incoherencies are common. Measurement errors can be categorized into two basic types:

1. random errors, due to intrinsic sensor accuracy, and
2. systematic errors (or gross errors), due to sensor calibration or faulty data transmission.

A random error means that the measurement $y$ is a random variable with mean $E[y] = y^*$, where $y^*$ is the true value, which is typically not known. A systematic error, on the other hand, is characterized by a measurement $y$ that is a random variable with mean $E[y] \neq y^*$, i.e. not equal to the true value. For ease in deriving and implementing an optimal estimation solution, and based on the argument that errors are the sum of many factors (so that the central limit theorem has some effect), data reconciliation assumes that these errors are normally distributed.

Other sources of error when calculating plant balances include process faults such as leaks, unmodeled heat losses, incorrect physical properties or other physical parameters used in the equations, and incorrect structure such as unmodeled bypass lines. Further errors include unmodeled plant dynamics such as holdup changes, and other instabilities in plant operations that violate steady-state (algebraic) models. Additional dynamic errors arise when measurements and samples are not taken at the same time, especially lab analyses. The normal practice of using time averages for the data input partly reduces the dynamic problems.
However, that does not completely resolve timing inconsistencies for infrequently sampled data like lab analyses. This use of average values, like a moving average, acts as a low-pass filter, so high-frequency noise is mostly eliminated. The result is that, in practice, data reconciliation mainly makes adjustments to correct systematic errors like biases.

Necessity of removing measurement errors

ISA-95 is the international standard for the integration of enterprise and control systems. [1] It asserts that:

Data reconciliation is a serious issue for enterprise-control integration. The data have to be valid to be useful for the enterprise system. The data must often be determined from physical measurements that have associated error factors. This must usually be converted into exact values for the enterprise system. This conversion may require manual, or intelligent reconciliation of the converted values [...]. Systems must be set up to ensure that accurate data
are sent to production and from production. Inadvertent operator or clerical errors may result in too much production, too little production, the wrong production, incorrect inventory, or missing inventory.

History

DVR has become more and more important as industrial processes have grown in complexity. DVR started in the early 1960s with applications aiming at closing material balances in production processes where raw measurements were available for all variables. [2] At the same time, the problem of gross error identification and elimination was presented. [3] In the late 1960s and 1970s, unmeasured variables were taken into account in the data reconciliation process. [4][5] DVR also became more mature through the consideration of general nonlinear equation systems coming from thermodynamic models. [6][7][8] Quasi-steady-state dynamics for filtering and simultaneous parameter estimation over time were introduced in 1977 by Stanley and Mah. [7] Dynamic DVR was formulated as a nonlinear optimization problem by Liebman et al. [9]

Data reconciliation

Data reconciliation is a technique that targets the correction of measurement errors that are due to measurement noise, i.e. random errors. From a statistical point of view, the main assumption is that no systematic errors exist in the set of measurements, since they may bias the reconciliation results and reduce its robustness.

Given $n$ measurements $y_1, \dots, y_n$, data reconciliation can mathematically be expressed as an optimization problem of the following form:

$$\min_{y_1^*, \dots, y_n^*, u_1, \dots, u_m} \sum_{i=1}^n \left( \frac{y_i^* - y_i}{\sigma_i} \right)^2$$

subject to

$$F(y^*, u) = 0, \qquad y_{\min} \le y^* \le y_{\max}, \qquad u_{\min} \le u \le u_{\max},$$

where $y_i^*$ is the reconciled value of the $i$-th measurement ($i = 1, \dots, n$), $y_i$ is the measured value of the $i$-th measurement, $u_j$ is the $j$-th unmeasured variable ($j = 1, \dots, m$), and $\sigma_i$ is the standard deviation of the $i$-th measurement. $F(y^*, u) = 0$ are the process equality constraints, and the inequalities are the bounds on the measured and unmeasured variables. The term $\left( \frac{y_i^* - y_i}{\sigma_i} \right)^2$ is called the penalty of measurement $i$.
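As an illustration, the optimization problem above admits a closed-form solution when there is a single linear constraint. The following sketch reconciles three flows against an assumed mass balance a = b + c via a Lagrange multiplier; the flow values and standard deviations are made-up illustration data, not from the article.

```python
# Data reconciliation for a single linear mass balance a = b + c,
# solved in closed form via a Lagrange multiplier (weighted least squares).
# The flow values and standard deviations below are made-up illustration data.

def reconcile(a, b, c, sa, sb, sc):
    """Return reconciled (a*, b*, c*) satisfying a* = b* + c* exactly."""
    residual = a - b - c                      # imbalance of the raw measurements
    lam = residual / (sa**2 + sb**2 + sc**2)  # Lagrange multiplier
    # Each measurement is corrected in proportion to its variance:
    return a - sa**2 * lam, b + sb**2 * lam, c + sc**2 * lam

a_r, b_r, c_r = reconcile(101.0, 45.0, 52.0, 1.0, 1.0, 1.0)
print(a_r, b_r, c_r)                 # corrections spread over all three flows
print(abs(a_r - b_r - c_r) < 1e-9)   # the balance now holds exactly
```

Note how the imbalance of 4 units is distributed over the three flows in proportion to their variances, which is exactly the weighted least-squares behavior of the objective function.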
The objective function $f$ is the sum of the penalties, $f = \sum_{i=1}^n \left( \frac{y_i^* - y_i}{\sigma_i} \right)^2$. In other words, one wants to minimize the overall correction (measured in the least-squares sense) that is needed in order to satisfy the system constraints. Additionally, each least-squares term is weighted by the standard deviation of the corresponding measurement.

Redundancy
(Figures: sensor redundancy, arising from multiple sensors measuring the same quantity at the same time at the same place; topological redundancy, arising from model information, e.g. a mass conservation constraint $a = b + c$ from which one can calculate $c$ when $a$ and $b$ are known.)

Data reconciliation relies strongly on the concept of redundancy to correct the measurements as little as possible in order to satisfy the process constraints. Here, redundancy is defined differently from redundancy in information theory. Instead, redundancy arises from combining sensor data with the model (the algebraic constraints), sometimes more specifically called "spatial redundancy", [7] "analytical redundancy", or "topological redundancy". Redundancy can be due to sensor redundancy, where sensors are duplicated in order to have more than one measurement of the same quantity. Redundancy also arises when a single variable can be estimated in several independent ways from separate sets of measurements at a given time or time-averaging period, using the algebraic constraints.

Redundancy is linked to the concept of observability. A variable (or system) is observable if the model and the sensor measurements can be used to uniquely determine its value (the system state). A sensor is redundant if its removal causes no loss of observability. Rigorous definitions of observability, calculability, and redundancy, along with criteria for determining them, were established by Stanley and Mah [10] for cases with set constraints such as algebraic equations and inequalities. Next, we illustrate some special cases:

Topological redundancy is intimately linked with the degrees of freedom ($dof$) of a mathematical system, [11] i.e. the minimum number of pieces of information (i.e. measurements) that are required in order to calculate all of the system variables. For instance, in the example above the flow conservation requires that $a = b + c$.
One needs to know the values of two of the three variables in order to calculate the third one. The degrees of freedom for the model in that case are equal to 2. At least 2 measurements are needed to estimate all the variables, and 3 would be needed for redundancy.

When speaking about topological redundancy, we have to distinguish between measured and unmeasured variables. In the following, let us denote by $x$ the unmeasured variables and by $y$ the measured variables. Then the system of $p$ process constraints becomes $F(x, y) = 0$, which is a nonlinear system in $x$ and $y$. If the system is calculable with the $n$ measurements given, then the level of topological redundancy is defined as $red = n - dof$, i.e. the number of additional measurements that are at hand on top of those measurements which are required in order to just calculate the system. Another way of viewing the level of redundancy is to use the definition of $dof$, which is the difference between the number of variables (measured and unmeasured) and the number of equations, $dof = (n + m) - p$, where $m$ is the number of unmeasured variables. Then one gets

$$red = n - dof = n - (n + m - p) = p - m,$$
i.e. the redundancy is the difference between the number of equations $p$ and the number of unmeasured variables $m$. The level of total redundancy is the sum of sensor redundancy and topological redundancy. We speak of positive redundancy if the system is calculable and the total redundancy is positive. One can see that the level of topological redundancy depends merely on the number of equations (the more equations, the higher the redundancy) and the number of unmeasured variables (the more unmeasured variables, the lower the redundancy), and not on the number of measured variables.

Simple counts of variables, equations, and measurements are inadequate for many systems, breaking down for several reasons: (a) portions of a system might have redundancy while others do not, and some portions might not even be possible to calculate, and (b) nonlinearities can lead to different conclusions at different operating points. As an example, consider the following system with 4 streams and 2 units.

Example of calculable and non-calculable systems

(Figures: a calculable system, where from $a$ and $b$ one can compute $c$, and knowing $c$ yields $d$; a non-calculable system, where knowing $c$ and $d$ gives no information about $a$ and $b$.)

We incorporate only flow conservation constraints and obtain $a = b + c$ and $c = d$. It is possible that the system is not calculable, even though $p \ge m$. If we have measurements for $c$ and $d$, but not for $a$ and $b$, then the system cannot be calculated (knowing $c$ and $d$ does not give information about $a$ and $b$). On the other hand, if $a$ and $b$ are known, but not $c$ and $d$, then the system can be calculated.

In 1981, observability and redundancy criteria were proven for these sorts of flow networks involving only mass and energy balance constraints. [12] After combining all the plant inputs and outputs into an "environment node", loss of observability corresponds to cycles of unmeasured streams. That is seen in the second case above, where streams $a$ and $b$ are in a cycle of unmeasured streams.
Redundancy classification follows by testing for a path of unmeasured streams, since that would lead to an unmeasured cycle if the measurement were removed. Measurements $c$ and $d$ are redundant in the second case above, even though part of the system is unobservable.

Benefits

Redundancy can be used as a source of information to cross-check and correct the measurements and increase their accuracy and precision. Further, the data reconciliation problem presented above also includes unmeasured variables. Based on information redundancy, estimates for these unmeasured variables can be calculated along with their accuracies. In industrial processes, these unmeasured variables that data reconciliation provides are referred to as soft sensors or virtual sensors, where hardware sensors are not installed.
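The calculable and non-calculable cases above can be checked mechanically for a linear network: the unmeasured variables are uniquely determined exactly when the constraint submatrix over the unmeasured columns has full column rank. A minimal sketch in Python, assuming the two flow conservation constraints a = b + c and c = d from the example (the matrix layout is an illustration, not a general algorithm for nonlinear systems):

```python
# Calculability check for a linear flow network: the unmeasured variables are
# uniquely determined iff the constraint submatrix over the unmeasured columns
# has full column rank. Assumed layout: unit 1 gives a = b + c, unit 2 gives
# c = d; variable order is [a, b, c, d].

def rank(rows):
    """Rank of a small dense matrix by Gaussian elimination."""
    m = [r[:] for r in rows]
    rk = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(rk, len(m)) if abs(m[i][col]) > 1e-12), None)
        if piv is None:
            continue
        m[rk], m[piv] = m[piv], m[rk]
        for i in range(len(m)):
            if i != rk and abs(m[i][col]) > 1e-12:
                f = m[i][col] / m[rk][col]
                m[i] = [x - f * y for x, y in zip(m[i], m[rk])]
        rk += 1
    return rk

A = [[1.0, -1.0, -1.0,  0.0],   # a - b - c = 0
     [0.0,  0.0,  1.0, -1.0]]   # c - d = 0

def calculable(unmeasured):
    """True iff all unmeasured variables can be computed from the measured ones."""
    sub = [[row[j] for j in unmeasured] for row in A]
    return rank(sub) == len(unmeasured)

print(calculable([2, 3]))  # c, d unmeasured; a, b measured -> calculable
print(calculable([0, 1]))  # a, b unmeasured (a cycle)     -> not calculable
```

This reproduces the graph-theoretic result quoted above: leaving the cycle of streams a and b unmeasured makes the system unobservable, while measuring a and b lets c and d be computed.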
Data validation

Data validation denotes all validation and verification actions before and after the reconciliation step.

Data filtering

Data filtering denotes the process of treating measured data such that the values become meaningful and lie within the range of expected values. Data filtering is necessary before the reconciliation process in order to increase the robustness of the reconciliation step. There are several ways of data filtering, for example taking the average of several measured values over a well-defined time period.

Result validation

Result validation is the set of validation or verification actions taken after the reconciliation process, and it takes into account measured and unmeasured variables as well as reconciled values. Result validation covers, but is not limited to, penalty analysis for determining the reliability of the reconciliation, or bound checks to ensure that the reconciled values lie in a certain range, e.g. the temperature has to be within some reasonable bounds.

Gross error detection

Result validation may include statistical tests to validate the reliability of the reconciled values, by checking whether gross errors exist in the set of measured values. These tests can be, for example, the chi-square test (global test) or the individual test. If no gross errors exist in the set of measured values, then the reconciliation residual of each measurement, $(y_i^* - y_i)/\sigma_i$, is a random variable that is normally distributed with mean equal to 0 and variance equal to 1. By consequence, the objective function $f$ is a random variable that follows a chi-square distribution, since it is the sum of squares of normally distributed random variables. Comparing the value of the objective function $f$ with a given percentile $P_{95}$ of the probability density function of a chi-square distribution (e.g. the 95th percentile for a 95% confidence) gives an indication of whether a gross error exists: if $f \le P_{95}$, then no gross errors exist with 95% probability.
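The global test can be sketched as follows. The flow values, reconciled values, and the assumed redundancy (used as the degrees of freedom) are made-up illustration data; the critical values are the standard 95th-percentile chi-square quantiles.

```python
# Global (chi-square) test for gross errors: compare the reconciliation
# objective f = sum(((y_rec - y_meas) / sigma)^2) against the 95th percentile
# of the chi-square distribution with dof equal to the system redundancy.
# All measurement numbers below are made-up illustration data.

CHI2_95 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488}  # standard 95% quantiles

def global_test(y_meas, y_rec, sigma, redundancy):
    """Return (f, passed): objective value and whether no gross error is flagged."""
    f = sum(((r - m) / s) ** 2 for m, r, s in zip(y_meas, y_rec, sigma))
    return f, f <= CHI2_95[redundancy]

# Small corrections: consistent with random noise only.
f, ok = global_test([100.0, 46.0, 53.0], [99.7, 46.3, 53.4], [1.0, 1.0, 1.0], 1)
print(round(f, 2), ok)

# A large correction on the first flow suggests a gross error (bias).
f, ok = global_test([108.0, 46.0, 53.0], [99.7, 46.3, 53.4], [1.0, 1.0, 1.0], 1)
print(round(f, 2), ok)
```

In practice the reconciled values come from the optimization step, and the degrees of freedom equal the redundancy $red = p - m$ of the constrained system.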
The chi-square test gives only a rough indication of the existence of gross errors, but it is easy to conduct: one only has to compare the value of the objective function with the critical value of the chi-square distribution. The individual test compares each penalty term in the objective function with the critical values of the normal distribution. If the $i$-th penalty term is outside the 95% confidence interval of the normal distribution, then there is reason to believe that this measurement has a gross error.

Advanced data validation and reconciliation

Advanced data validation and reconciliation (DVR) is an integrated approach combining data reconciliation and data validation techniques, which is characterized by:

complex models incorporating, besides mass balances, also thermodynamics, momentum balances, equilibrium constraints, hydrodynamics, etc.,
gross error remediation techniques to ensure the meaningfulness of the reconciled values,
robust algorithms for solving the reconciliation problem.
Thermodynamic models

Simple models include mass balances only. When thermodynamic constraints such as energy balances are added to the model, its scope and level of redundancy increase. Indeed, as we have seen above, the level of redundancy is defined as $red = p - m$, where $p$ is the number of equations. Including energy balances means adding equations to the system, which results in a higher level of redundancy (provided that enough measurements are available, or, equivalently, not too many variables are unmeasured).

Gross error remediation

Gross errors are systematic measurement errors that may bias the reconciliation results. Therefore, it is important to identify and eliminate these gross errors from the reconciliation process. After the reconciliation, statistical tests can be applied that indicate whether or not a gross error exists somewhere in the set of measurements. These techniques of gross error remediation are based on two concepts:

gross error elimination,
gross error relaxation.

Gross error elimination determines one measurement that is biased by a systematic error and discards this measurement from the data set. The determination of the measurement to be discarded is based on different kinds of penalty terms that express how much the measured values deviate from the reconciled values. Once the gross errors are detected, they are discarded from the measurements, and the reconciliation can be done without these faulty measurements that would otherwise spoil the reconciliation process. If needed, the elimination is repeated until no gross error exists in the set of measurements.

Gross error relaxation targets relaxing the estimate for the uncertainty of suspicious measurements so that the reconciled value lies in the 95% confidence interval.
Relaxation typically finds application when it is not possible to determine which measurement around one unit is responsible for the gross error (equivalence of gross errors). Then the measurement uncertainties of the measurements involved are increased. It is important to note that the remediation of gross errors reduces the quality of the reconciliation: either the redundancy decreases (elimination) or the uncertainty of the measured data increases (relaxation). Therefore, it can only be applied when the initial level of redundancy is high enough to ensure that the data reconciliation can still be done (see Section 2, [11]).

Workflow

(Figure: the workflow of an advanced data validation and reconciliation process.)

Advanced DVR solutions offer an integration of the techniques mentioned above:

1. data acquisition from a data historian, database or manual inputs
2. data validation and filtering of raw measurements
3. data reconciliation of filtered measurements
4. result verification, including range checks and gross error remediation (and, if needed, a return to step 3)
5. result storage (raw measurements together with reconciled values)

The result of an advanced DVR procedure is a coherent set of validated and reconciled process data.

Applications

DVR finds application mainly in industry sectors where either measurements are not accurate or are even non-existent, for example in the upstream sector where flow meters are difficult or expensive to position (see [13]); or where accurate data are of high importance, for example for safety reasons in nuclear power plants (see [14]). Another field of application is performance and process monitoring (see [15]) in oil refining or in the chemical industry.

As DVR enables the calculation of reliable estimates even for unmeasured variables, the German Engineering Society (VDI Gesellschaft Energie und Umwelt) has accepted the technology of DVR as a means to replace expensive sensors in the nuclear power industry (see VDI norm 2048, [11]).

See also

Process simulation
Pinch analysis
Industrial processes
Chemical engineering

References

1. "ISA-95: the international standard for the integration of enterprise and control systems", isa-95.com.
2. D.R. Kuehn, H. Davidson, Computer Control II. Mathematics of Control, Chem. Eng. Process 57: 44-47.
3. V. Vaclavek, Studies on System Engineering I. On the Application of the Calculus of Observations to Calculations of Chemical Engineering Balances, Coll. Czech Chem. Commun 34: 3653.
4. V. Vaclavek, M. Loucka, Selection of Measurements Necessary to Achieve Multicomponent Mass Balances in Chemical Plant, Chem. Eng. Sci. 31.
5. R.S.H. Mah, G.M. Stanley, D.W. Downing, Reconciliation and Rectification of Process Flow and Inventory Data, Ind. & Eng. Chem. Proc. Des. Dev. 15 (1976).
6. J.C. Knepper, J.W. Gorman, Statistical Analysis of Constrained Data Sets, AIChE Journal 26.
7. G.M. Stanley and R.S.H.
Mah, Estimation of Flows and Temperatures in Process Networks, AIChE Journal 23 (1977).
8. P. Joris, B. Kalitventzeff, Process measurements analysis and validation, Proc. CEF 87: Use Comput. Chem. Eng., Italy, 41-46.
9. M.J. Liebman, T.F. Edgar, L.S. Lasdon, Efficient Data Reconciliation and Estimation for Dynamic Processes Using Nonlinear Programming Techniques, Computers Chem. Eng. 16.
10. Stanley, G.M., Mah, R.S.H., "Observability and Redundancy in Process Data Estimation", Chem. Engng. Sci. 36, 259 (1981).
11. VDI-Gesellschaft Energie und Umwelt, "Guidelines - VDI 2048 Blatt 1 - Uncertainties of measurements at acceptance tests for energy conversion and power plants - Fundamentals", Association of German Engineers.
12. Stanley, G.M., Mah, R.S.H., "Observability and Redundancy Classification in Process Networks", Chem. Engng. Sci. 36, 1941 (1981).
13. P. Delava, E. Maréchal, B. Vrielynck, B. Kalitventzeff (1999), Modelling of a Crude Oil Distillation Unit in Terms of Data Reconciliation with ASTM or TBP Curves as Direct Input - Application: Crude Oil Preheating Train, Proceedings of the ESCAPE-9 conference, Budapest, May 31 - June 2, 1999, supplementary volume.
14. M. Langenstein, J. Jansky, B. Laipple (2004), Finding Megawatts in Nuclear Power Plants with Process Data Validation, Proceedings of ICONE12, Arlington, USA, April 25-29, 2004.
15. Th. Amand, G. Heyen, B. Kalitventzeff, Plant Monitoring and Fault Detection: Synergy between Data Reconciliation and Principal Component Analysis, Comp. and Chem. Eng. 25.

Further reading

Alexander, Dave, Tannar, Dave & Wasik, Larry, "Mill Information System uses Dynamic Data Reconciliation for Accurate Energy Accounting", TAPPI Fall Conference 2007.
Rankin, J. & Wasik, L., "Dynamic Data Reconciliation of Batch Pulping Processes (for On-Line Prediction)", PAPTAC Spring Conference.
S. Narasimhan, C. Jordache, Data Reconciliation and Gross Error Detection: An Intelligent Use of Process Data, Gulf Publishing Company, Houston.
V. Veverka, F. Madron, Material and Energy Balancing in the Process Industries, Elsevier Science BV, Amsterdam.
J. Romagnoli, M.C. Sanchez, Data Processing and Reconciliation for Chemical Process Operations, Academic Press.

Categories: Data management

This page was last modified on 20 September 2016, at 09:32. Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.
Output-error model identification: linear time-invariant systems Dipartimento di Scienze e Tecnologie Aerospaziali, Politecnico di Milano The OE method for LTI systems 2 For linear time-invariant systems
More informationINCOOP Workshop Düsseldorf, January 23-24, 2003
Plant-wide on-line dynamic modeling with state estimation: Application to polymer plant operation and involvement in trajectory control and optimization. Philippe Hayot Global Process Engineering The Dow
More informationNEW FEATURES IN CHEMCAD VERSION 6: VERSION RELEASE NOTES. CHEMCAD New Features and Enhancements. CHEMCAD Maintenance
NEW FEATURES IN CHEMCAD VERSION 6: Uses a single mode for flowsheet drawing, specification, calculation, and PFD creation Creates single-file simulations that are easy to work with and share Easy cloning
More informationRoute Map (Start September 2012) Year 9
Route Map (Start September 2012) Year 9 3 th 7 th Sept 10 th -14 th Sept 17 th 21 st Sept 24 th 28 th Sept 1 st -5 th Oct 8 th -12 th Oct 15 th 19 th Oct 22 nd -26 th Oct 29 th Oct-2 nd Nov 5 th -9 th
More informationModel based soft-sensors based on OPC Unified Architecture
Model based soft-sensors based on OPC Unified Architecture Paolo Greppi, consultant, 3iP, Italy POWER-GEN Europe 2010 Conference June 10th, 2010 Amsterdam Presentation outline The problem Old solution
More informationCOPYRIGHTED MATERIAL INTRODUCTION TO ASPEN PLUS CHAPTER ONE
CHAPTER ONE INTRODUCTION TO ASPEN PLUS Aspen Plus is based on techniques for solving flowsheets that were employed by chemical engineers many years ago. Computer programs were just beginning to be used,
More informationSinar large seed Moisture Analyser
Sinar large seed Moisture Analyser Easy to use no sampling, weighing or grinding, simply fill and read Fast instant digital readout of moisture and temperature Robust rugged construction with no moving
More informationNew Rules of ME Ph.D. Qualifying Exams 1/8
New Rules of ME Ph.D. Qualifying Exams 1/8 Qualifying Examination The student must pass a Qualifying Examination before the Dissertation Director, the Interdisciplinary Committee, and the courses for the
More informationObject-Oriented and Classical Software Engineering
Slide 6.1 Object-Oriented and Classical Software Engineering Seventh Edition, WCB/McGraw-Hill, 2007 Stephen R. Schach srs@vuse.vanderbilt.edu CHAPTER 6 Slide 6.2 TESTING 1 Overview Slide 6.3 Quality issues
More informationObject-Oriented and Classical Software Engineering
Slide 6.1 Object-Oriented and Classical Software Engineering Seventh Edition, WCB/McGraw-Hill, 2007 Stephen R. Schach srs@vuse.vanderbilt.edu CHAPTER 6 Slide 6.2 TESTING Overview Slide 6.3 Quality issues
More informationGE Energy. Bently Nevada * Essential Insight.mesh * Wireless Condition Monitoring
GE Energy Bently Nevada * Essential Insight.mesh * Wireless Condition Monitoring Increasing Reliability in Plants Today No matter where an asset sits in the plant, the need to understand its health has
More informationModeling with Uncertainty Interval Computations Using Fuzzy Sets
Modeling with Uncertainty Interval Computations Using Fuzzy Sets J. Honda, R. Tankelevich Department of Mathematical and Computer Sciences, Colorado School of Mines, Golden, CO, U.S.A. Abstract A new method
More informationgood check of volumetric accuracy. However, if the mea error components. However, if the errors measured are large,
REVIEW OF SCIENTIFIC INSTRUMENTS VOLUME 71, NUMBER 10 OCTOBER 2000 Laser vector measurement technique for the determination and compensation of volumetric positioning errors. Part I: Basic theory Charles
More informationMixture Models and EM
Mixture Models and EM Goal: Introduction to probabilistic mixture models and the expectationmaximization (EM) algorithm. Motivation: simultaneous fitting of multiple model instances unsupervised clustering
More informationIntroduction to C omputational F luid Dynamics. D. Murrin
Introduction to C omputational F luid Dynamics D. Murrin Computational fluid dynamics (CFD) is the science of predicting fluid flow, heat transfer, mass transfer, chemical reactions, and related phenomena
More informationBusiness Impacts of Poor Data Quality: Building the Business Case
Business Impacts of Poor Data Quality: Building the Business Case David Loshin Knowledge Integrity, Inc. 1 Data Quality Challenges 2 Addressing the Problem To effectively ultimately address data quality,
More informationCh 5 Industrial Control Systems
Ch 5 Industrial Control Systems Sections: 1. Process Industries vs. Discrete Manufacturing Industries 2. Continuous vs. Discrete Control 3. Computer Process Control Industrial Control - Defined The automatic
More informationDesignDirector Version 1.0(E)
Statistical Design Support System DesignDirector Version 1.0(E) User s Guide NHK Spring Co.,Ltd. Copyright NHK Spring Co.,Ltd. 1999 All Rights Reserved. Copyright DesignDirector is registered trademarks
More informationHow ISA Technical Divisions Benefited my Career
How ISA Technical Divisions Benefited my Career also known as. Introduction to ISA Technical Divisions Standards Certification Education & Training Publishing Conferences & Exhibits Speaker: Graham Nasby
More informationTruncation Errors. Applied Numerical Methods with MATLAB for Engineers and Scientists, 2nd ed., Steven C. Chapra, McGraw Hill, 2008, Ch. 4.
Chapter 4: Roundoff and Truncation Errors Applied Numerical Methods with MATLAB for Engineers and Scientists, 2nd ed., Steven C. Chapra, McGraw Hill, 2008, Ch. 4. 1 Outline Errors Accuracy and Precision
More informationMachine Learning (CSMML16) (Autumn term, ) Xia Hong
Machine Learning (CSMML16) (Autumn term, 28-29) Xia Hong 1 Useful books: 1. C. M. Bishop: Pattern Recognition and Machine Learning (2007) Springer. 2. S. Haykin: Neural Networks (1999) Prentice Hall. 3.
More informationround decimals to the nearest decimal place and order negative numbers in context
6 Numbers and the number system understand and use proportionality use the equivalence of fractions, decimals and percentages to compare proportions use understanding of place value to multiply and divide
More informationTesting! Prof. Leon Osterweil! CS 520/620! Spring 2013!
Testing Prof. Leon Osterweil CS 520/620 Spring 2013 Relations and Analysis A software product consists of A collection of (types of) artifacts Related to each other by myriad Relations The relations are
More informationReal-time Monitoring of Multi-mode Industrial Processes using Feature-extraction Tools
Real-time Monitoring of Multi-mode Industrial Processes using Feature-extraction Tools Y. S. Manjili *, M. Niknamfar, M. Jamshidi Department of Electrical and Computer Engineering The University of Texas
More informationCOMPUTATIONAL FLUID DYNAMICS ANALYSIS OF ORIFICE PLATE METERING SITUATIONS UNDER ABNORMAL CONFIGURATIONS
COMPUTATIONAL FLUID DYNAMICS ANALYSIS OF ORIFICE PLATE METERING SITUATIONS UNDER ABNORMAL CONFIGURATIONS Dr W. Malalasekera Version 3.0 August 2013 1 COMPUTATIONAL FLUID DYNAMICS ANALYSIS OF ORIFICE PLATE
More informationThe Encoding Complexity of Network Coding
The Encoding Complexity of Network Coding Michael Langberg Alexander Sprintson Jehoshua Bruck California Institute of Technology Email: mikel,spalex,bruck @caltech.edu Abstract In the multicast network
More informationThwarting Traceback Attack on Freenet
Thwarting Traceback Attack on Freenet Guanyu Tian, Zhenhai Duan Florida State University {tian, duan}@cs.fsu.edu Todd Baumeister, Yingfei Dong University of Hawaii {baumeist, yingfei}@hawaii.edu Abstract
More informationPractical Importance of the FOUNDATION TM Fieldbus Interoperability Test System
Stephen Mitschke Applications Engineer Fieldbus Foundation Practical Importance of the FOUNDATION TM Fieldbus Interoperability System Steve Vreeland Senior Software Engineer Fieldbus Inc. Austin, TX 78759
More informationTESTING. Overview Slide 6.2. Testing (contd) Slide 6.4. Testing Slide 6.3. Quality issues Non-execution-based testing
Slide 6.1 Overview Slide 6.2 Quality issues Non-execution-based testing TESTING Execution-based testing What should be tested? Testing versus correctness proofs Who should perform execution-based testing?
More informationClassroom Tips and Techniques: Least-Squares Fits. Robert J. Lopez Emeritus Professor of Mathematics and Maple Fellow Maplesoft
Introduction Classroom Tips and Techniques: Least-Squares Fits Robert J. Lopez Emeritus Professor of Mathematics and Maple Fellow Maplesoft The least-squares fitting of functions to data can be done in
More informationVERSION RELEASE NOTES... 2 VERSION RELEASE NOTES... 3 VERSION RELEASE NOTES... 5
Contents VERSION 6.3.3.4657 RELEASE NOTES... 2... 2... 2... 2 CC-BATCH... 2 VERSION 6.3.2.4389 RELEASE NOTES... 3... 3... 3... 3 CC-DYNAMICS... 4 CC-BATCH... 4 VERSION 6.3.1.4112 RELEASE NOTES... 5...
More informationIMAGE DE-NOISING IN WAVELET DOMAIN
IMAGE DE-NOISING IN WAVELET DOMAIN Aaditya Verma a, Shrey Agarwal a a Department of Civil Engineering, Indian Institute of Technology, Kanpur, India - (aaditya, ashrey)@iitk.ac.in KEY WORDS: Wavelets,
More informationCHAPTER 4. Numerical Models. descriptions of the boundary conditions, element types, validation, and the force
CHAPTER 4 Numerical Models This chapter presents the development of numerical models for sandwich beams/plates subjected to four-point bending and the hydromat test system. Detailed descriptions of the
More informationPreprocessing Short Lecture Notes cse352. Professor Anita Wasilewska
Preprocessing Short Lecture Notes cse352 Professor Anita Wasilewska Data Preprocessing Why preprocess the data? Data cleaning Data integration and transformation Data reduction Discretization and concept
More informationTransducers and Transducer Calibration GENERAL MEASUREMENT SYSTEM
Transducers and Transducer Calibration Abstracted from: Figliola, R.S. and Beasley, D. S., 1991, Theory and Design for Mechanical Measurements GENERAL MEASUREMENT SYSTEM Assigning a specific value to a
More information1. Estimation equations for strip transect sampling, using notation consistent with that used to
Web-based Supplementary Materials for Line Transect Methods for Plant Surveys by S.T. Buckland, D.L. Borchers, A. Johnston, P.A. Henrys and T.A. Marques Web Appendix A. Introduction In this on-line appendix,
More informationSimultaneous Validation of Online Analyzers and Process Simulators by Process Data Reconciliation
A publication of 1303 VOL. 32, 2013 CHEMICAL ENGINEERING TRANSACTIONS Chief Editors: Sauro Pierucci, Jiří J. Klemeš Copyright 2013, AIDIC Servizi S.r.l., ISBN 978-88-95608-23-5; ISSN 1974-9791 The Italian
More informationViewpoint Review & Analytics
The Viewpoint all-in-one e-discovery platform enables law firms, corporations and service providers to manage every phase of the e-discovery lifecycle with the power of a single product. The Viewpoint
More informationProceedings of Meetings on Acoustics
Proceedings of Meetings on Acoustics Volume 19, 2013 http://acousticalsociety.org/ ICA 2013 Montreal Montreal, Canada 2-7 June 2013 Underwater Acoustics Session 2aUW: Wave Propagation in a Random Medium
More informationCHAPTER 8 DISCUSSIONS
153 CHAPTER 8 DISCUSSIONS This chapter discusses the developed models, methodologies to solve the developed models, performance of the developed methodologies and their inferences. 8.1 MULTI-PERIOD FIXED
More informationDESIGN BASIS VERIFICATION AND PRESERVICE TESTING CONSIDERATIONS FOR OM CODE MANDATORY APPENDIX III
Proceedings of the ASME/NRC 2017 13th Pump and Valve Symposium PVS2017 July 17-18, 2017, Silver Spring, Maryland PVS2017-3504 DESIGN BASIS VERIFICATION AND PRESERVICE TESTING CONSIDERATIONS FOR OM CODE
More informationUNIT 2 Data Preprocessing
UNIT 2 Data Preprocessing Lecture Topic ********************************************** Lecture 13 Why preprocess the data? Lecture 14 Lecture 15 Lecture 16 Lecture 17 Data cleaning Data integration and
More informationBootstrapping Method for 14 June 2016 R. Russell Rhinehart. Bootstrapping
Bootstrapping Method for www.r3eda.com 14 June 2016 R. Russell Rhinehart Bootstrapping This is extracted from the book, Nonlinear Regression Modeling for Engineering Applications: Modeling, Model Validation,
More informationPetro-SIM Simulator and CAPE-OPEN: Experiences and Successes
Petro-SIM Simulator and CAPE-OPEN: Experiences and Successes Michael Aylott, KBC Advanced Technologies, Calgary, AB, Canada Ben van der Merwe, KBC Advanced Technologies, Calgary, AB, Canada Abstract KBC
More informationMonte Carlo method to machine tool uncertainty evaluation
Available online at www.sciencedirect.com ScienceDirect Procedia Manufacturing 13 (2017) 585 592 www.elsevier.com/locate/procedia Manufacturing Engineering Society International Conference 2017, MESIC
More informationNUMERICAL METHODS PERFORMANCE OPTIMIZATION IN ELECTROLYTES PROPERTIES MODELING
NUMERICAL METHODS PERFORMANCE OPTIMIZATION IN ELECTROLYTES PROPERTIES MODELING Dmitry Potapov National Research Nuclear University MEPHI, Russia, Moscow, Kashirskoe Highway, The European Laboratory for
More informationResearch on the raw data processing method of the hydropower construction project
IOP Conference Series Earth and Environmental Science PAPER OPEN ACCESS Research on the raw data processing method of the hydropower construction project To cite this article Zhichao Tian IOP Conf. Ser.
More informationMathematics Background
Finding Area and Distance Students work in this Unit develops a fundamentally important relationship connecting geometry and algebra: the Pythagorean Theorem. The presentation of ideas in the Unit reflects
More informationA New Tool Providing an Integrated Framework for Process Optimization
EngOpt 2008 - International Conference on Engineering Optimization Rio de Janeiro, Brazil, 01-05 June 2008. A New Tool Providing an Integrated Framework for Process Optimization E.C. do Valle; R.P. Soares;
More informationGeneral properties of staircase and convex dual feasible functions
General properties of staircase and convex dual feasible functions JÜRGEN RIETZ, CLÁUDIO ALVES, J. M. VALÉRIO de CARVALHO Centro de Investigação Algoritmi da Universidade do Minho, Escola de Engenharia
More information3. Data Preprocessing. 3.1 Introduction
3. Data Preprocessing Contents of this Chapter 3.1 Introduction 3.2 Data cleaning 3.3 Data integration 3.4 Data transformation 3.5 Data reduction SFU, CMPT 740, 03-3, Martin Ester 84 3.1 Introduction Motivation
More informationChapter 3. Bootstrap. 3.1 Introduction. 3.2 The general idea
Chapter 3 Bootstrap 3.1 Introduction The estimation of parameters in probability distributions is a basic problem in statistics that one tends to encounter already during the very first course on the subject.
More information2. Data Preprocessing
2. Data Preprocessing Contents of this Chapter 2.1 Introduction 2.2 Data cleaning 2.3 Data integration 2.4 Data transformation 2.5 Data reduction Reference: [Han and Kamber 2006, Chapter 2] SFU, CMPT 459
More informationFUZZY SOFT SENSORS FOR CHEMICAL AND OIL REFINING PROCESSES. Natalia Bakhtadze, Evgeny M. Maximov, Ravil T. Valiakhmetov
roceedings of the 7th World Congress The International Federation of Automatic Control Seoul, Korea, July 6-, 28 FUZZY SOFT SENSORS FOR CHEMICAL AND OIL REFINING ROCESSES Natalia Bakhtadze, Evgeny M. Maximov,
More informationINTEGRATION OF CAMPAIGN SCHEDULING, DYNAMIC OPTIMIZATION AND OPTIMAL CONTROL IN MULTI-UNIT BATCH PROCESSES
INTEGRATION OF CAMPAIGN SCHEDULING, DYNAMIC OPTIMIZATION AND OPTIMAL CONTROL IN MULTI-UNIT BATCH PROCESSES F. Rossi a,b *, G. Reklaitis a, F. Manenti b, G. Buzzi-Ferraris b a Purdue University, Forney
More information1314. Estimation of mode shapes expanded from incomplete measurements
34. Estimation of mode shapes expanded from incomplete measurements Sang-Kyu Rim, Hee-Chang Eun, Eun-Taik Lee 3 Department of Architectural Engineering, Kangwon National University, Samcheok, Korea Corresponding
More informationEnhanced function of standard controller by control variable sensor discredibility detection
Proceedings of the 5th WSEAS Int. Conf. on System Science and Simulation in Engineering, Tenerife, Canary Islands, Spain, December 16-18, 2006 119 Enhanced function of standard controller by control variable
More informationParameter Estimation and Model Order Identification of LTI Systems
Preprint, 11th IFAC Symposium on Dynamics and Control of Process Systems, including Biosystems Parameter Estimation and Model Order Identification of LTI Systems Santhosh Kumar Varanasi, Phanindra Jampana
More informationSTATISTICAL CALIBRATION: A BETTER APPROACH TO INTEGRATING SIMULATION AND TESTING IN GROUND VEHICLE SYSTEMS.
2016 NDIA GROUND VEHICLE SYSTEMS ENGINEERING and TECHNOLOGY SYMPOSIUM Modeling & Simulation, Testing and Validation (MSTV) Technical Session August 2-4, 2016 - Novi, Michigan STATISTICAL CALIBRATION: A
More informationTechnical Report of ISO/IEC Test Program of the M-DISC Archival DVD Media June, 2013
Technical Report of ISO/IEC 10995 Test Program of the M-DISC Archival DVD Media June, 2013 With the introduction of the M-DISC family of inorganic optical media, Traxdata set the standard for permanent
More informationAdequacy Testing of Some Algorithms for Feedforward Control of a Propane-propylene Distillation Process
Adequacy Testing of Some Algorithms for Feedforward Control of a Propane-propylene Distillation Process NICOLAE PARASCHIV*, EMIL PRICOP* Petroleum Gas University of Ploiesti,, Control Engineering, Computers
More informationSnell s Law. Introduction
Snell s Law Introduction According to Snell s Law [1] when light is incident on an interface separating two media, as depicted in Figure 1, the angles of incidence and refraction, θ 1 and θ 2, are related,
More informationFathom Dynamic Data TM Version 2 Specifications
Data Sources Fathom Dynamic Data TM Version 2 Specifications Use data from one of the many sample documents that come with Fathom. Enter your own data by typing into a case table. Paste data from other
More informationCT Systems and their standards
CT Systems and their standards Stephen Brown Engineering Measurement 11 th April 2012 Industrial X-ray computed tomography: The future of co-ordinate metrology? Burleigh Court, Loughborough University
More information