Lecture: Simulation of Manufacturing Systems. Sivakumar AI. SMA6304 M2, Factory Planning and Scheduling.

SMA6304 M2, Factory Planning and Scheduling
Lecture 12: Discrete Event Simulation of Manufacturing Systems
Sivakumar AI (copyright 2002 Sivakumar)

Simulation: A Predictive Tool
[Diagram: a scheduling system with a simulator as its decision-support engine, turning manufacturing data into plans, schedules, the next job and the next task.]

Lecture (12 December 2002): Simulation
1. Introduction
2. Input probability distributions
3. Generating random numbers and random variates
4. Verification and validation
5. Output data analysis
6. Simulation life cycle

References:
Law, A. M. and Kelton, W. D., Simulation Modeling and Analysis, McGraw-Hill.
Banks, J., Handbook of Simulation, EMP.

1. Introduction

The Meaning of Simulation (Oxford English Dictionary)
Simulation: the technique of imitating the behavior of some situation or system (manufacturing, etc.) by means of an analogous situation, model or apparatus, either to gain information more conveniently or...
Simulator: an apparatus or system for reproducing the behavior of some situation or system; ... and gives the illusion of behaving like the real thing.

System Modeling
Model:
- A simplified or idealized description of a system, situation, or process, often in mathematical terms, devised to facilitate calculations and predictions.
- A representation of an object, system or idea in a form other than that of the entity or system itself.
- An abstraction and simplification of the real world.

System Modeling: Functions of Models
As an analytical tool:
- analyzing manufacturing systems
- evaluating equipment requirements
- designing a transportation facility
- setting the ordering policy for an inventory system
As an aid for experimentation, for planning and scheduling, as an aid to thought, as an aid to communication, and for education and training.

Classification of Models
Physical models:
- analog models of continuous systems, e.g. traffic flow
- iconic models, e.g. pilot training simulators
Analytical/mathematical models:
- most scheduling systems; represent a system in terms of quantitative relationships
Static simulation models:
- time does not play a role, e.g. Monte Carlo simulation
Conventional simulation models:
- represent the system as it evolves over time and are therefore dynamic
- have inputs, outputs and internal structure; they are run rather than solved
- empirical and stochastic (can be deterministic for scheduling applications)
Online simulation models:
- like conventional models, but near-real-time; useful for decision support

Classification of Simulation Models
Deterministic vs. stochastic simulation models:
- if the model has no probabilistic components it is deterministic
- if random input components are used it is stochastic
Continuous vs. discrete-event simulation models:
- discrete-event simulation models a system as it evolves over time by a representation in which state variables change instantaneously at event times
- continuous simulation models the system over time by a representation in which state variables change continuously with respect to time (e.g. using differential equations)
Combined discrete-continuous simulation:
- for systems that are neither completely discrete nor completely continuous, e.g. the arrival of a tanker and the filling of it

Simulation Models
Manual model generation (conventional simulation model):
- for analytical work, decisions or confirmation of decisions
- rapid model building, but manually intensive and hard to maintain
Automatic model generation (dynamic, near-real-time simulation model):
- most suitable for planning and scheduling
- uses near-real-time data, is fully automatic, integrated with information systems, and needs no direct maintenance

Some Basic Definitions
System state variables: the collection of information needed to define, to a sufficient level, what is happening in a system at a given point in time.
Events: exogenous, e.g. an order arrival; endogenous, e.g. a machine breakdown.
Entities and attributes: a dynamic entity, e.g. a customer; a static entity, e.g. a machine. An entity is defined by its attributes, e.g. the quantity of a lot.
Resources: a resource is a static entity that provides service to a dynamic entity (a lot).
Activity and delay: an activity is a period whose duration is known; a delay is an indefinite duration caused by a combination of system conditions.

Four Types of Modeling Structures
Event-scheduling method: events are scheduled by advancing the simulation clock exactly to the time of the next event. This is one of the most accurate structures.
Activity scanning method
Three-phase method
Process interaction method

Example: M/M/1 Queue
[Diagram: arriving jobs (customers, with IID random arrivals) wait in a queue in front of a single server (machine); one job is being processed (the customer in service); completed jobs depart.]

Next-Event Time-Advance Mechanism in the Event-Scheduling Method for an M/M/1 Queue
t_i = time of arrival of the i-th customer
A_i = t_i - t_{i-1} = interarrival time (IID random variable)
S_i = service time of the i-th customer (IID random variable)
D_i = observed delay of the i-th customer in queue
c_i = t_i + D_i + S_i = completion time of the i-th customer
e_i = time of occurrence of the i-th event (of any type)
B(t) = busy function, defined as 1 if the server is busy at time t and 0 if the server is idle at time t

Interarrival times A_i and service times S_i have cumulative distribution functions F_a and F_s, which are determined by collecting actual past data and fitting distributions (see 2. Input probability distributions). Each value of t_i is computed using generated values of A_i, i.e. random observations from the chosen distribution (see 3. Generating Random Numbers and Random Variates).

[Diagram: the simulation clock versus the real-time clock; events e_0, e_1, ..., e_5 occur at times 0, t_1, t_2, c_1, t_3, c_2, separated by the interarrival times A_1, A_2, A_3 and the service times S_1, S_2.]

The simulation clock is advanced from each event to the next event based on the event times in the event list. A machine starting to process a job is an activity, and its end time is known from S_i and F_s. When a customer (or job) arrives and the server is busy, it joins the queue and the end time of its wait is unknown; this is a delay. (A code sketch of this next-event logic follows the table below.)

Discrete-Event Simulation of an M/M/1 Queue
Job   Arrival t_i   Service S_i   Start   End c_i   Delay D_i
J1    0.4           2.0           0.4     2.4       0
J2    1.6           0.7           2.4     3.1       0.8
J3    2.1           0.2           3.1     3.3       1.0
J4    3.8           1.1           3.8     4.9       0
J5    4.0           3.7           4.9     8.6       0.9
J6    5.6           2.8
J7    5.8           1.6
J8    7.2           3.1
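
As an illustration of the event-scheduling logic above, here is a minimal Python sketch (not from the lecture) of a next-event time-advance simulation of an M/M/1 queue. The function name mm1_simulation, the use of Python's heapq as the event list, and the exponential arrival and service rates lam and mu are illustrative assumptions.

```python
import heapq
import random

def mm1_simulation(lam, mu, num_customers, seed=1):
    """Next-event time-advance simulation of an M/M/1 queue.

    Returns the observed delays in queue, D_i, of the first num_customers jobs."""
    rng = random.Random(seed)
    clock = 0.0
    server_busy = False
    queue = []                                     # arrival times of jobs waiting in queue
    delays = []                                    # observed delays D_i
    events = [(rng.expovariate(lam), "arrival")]   # event list: (event time, event type)

    while len(delays) < num_customers:
        clock, kind = heapq.heappop(events)        # advance the clock to the next event
        if kind == "arrival":
            heapq.heappush(events, (clock + rng.expovariate(lam), "arrival"))
            if server_busy:
                queue.append(clock)                # a delay of unknown duration starts
            else:
                server_busy = True
                delays.append(0.0)                 # served immediately: D_i = 0
                heapq.heappush(events, (clock + rng.expovariate(mu), "departure"))
        else:                                      # departure event
            if queue:
                delays.append(clock - queue.pop(0))
                heapq.heappush(events, (clock + rng.expovariate(mu), "departure"))
            else:
                server_busy = False

    return delays

delays = mm1_simulation(lam=1.0, mu=1.25, num_customers=1000)
print(sum(delays) / len(delays))     # point estimate of the expected average delay
```

The event list drives the clock: it always jumps directly to the earliest scheduled event, exactly as described above.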

Discrete-Event Simulation of an M/M/1 Queue
[Figure: sample paths of the busy function B(t) (0 or 1) and the queue length Q(t) (0 to 3) over time, with arrival events at e_1 = 0.4, e_2 = 1.6, e_3 = 2.1, e_7 = 3.8, e_8 = 4.0, e_10 = 5.6, e_11 = 5.8, e_12 = 7.2 and departure events at e_4 = 2.4, e_5 = 3.1, e_6 = 3.3, e_9 = 4.9, e_13 = 8.6.]

Expected Average Delay in the M/M/1 Queue
From a single simulation run of n jobs (customers), a point estimate of the expected average delay in queue of the n jobs (customers) is

\hat{D}(n) = \frac{\sum_{i=1}^{n} D_i}{n}

Expected Average Number of Jobs in the M/M/1 Queue
T_i = total time during the simulation in which the queue length is observed to be i
T(n) = time required to observe n delays in queue
Q(t) = number of customers (jobs) in queue at time t
\hat{q}(n) = average number of customers (jobs) in queue during the n observations:

\hat{q}(n) = \frac{\sum_{i=1}^{\infty} i\,T_i}{T(n)}

Since

\sum_{i=1}^{\infty} i\,T_i = \int_0^{T(n)} Q(t)\,dt

we get

\hat{q}(n) = \frac{\int_0^{T(n)} Q(t)\,dt}{T(n)}

Expected Utilization of the Machine in the M/M/1 System
\hat{u}(n) = expected proportion of time the server (machine) is busy during the n observations. Since B(t) is always either 0 or 1,

\hat{u}(n) = \frac{\int_0^{T(n)} B(t)\,dt}{T(n)}

Simple Output from Discrete-Event Simulation
Take the earlier table of arrivals and services, with the first 5 observed completions (T(n) = 8.6):

\sum_{i=1}^{\infty} i\,T_i = (0 \times 3.2) + (1 \times 2.3) + (2 \times 1.7) + (3 \times 1.4) = 9.9, \qquad T_i = 0 \text{ for } i \ge 4

and therefore \hat{q}(n) = 9.9/8.6 = 1.15 and \hat{u}(n) = [(3.3 - 0.4) + (8.6 - 3.8)]/8.6 = 0.90.
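
The worked numbers above can be checked with a short Python sketch (not part of the lecture) that replays the table's arrival and service times through a single server, reconstructs the start and end times, and integrates Q(t) and B(t) numerically up to T(n) = 8.6. The step size and variable names are illustrative assumptions.

```python
# Arrival and service times taken from the table above (jobs J1..J8).
arrivals = [0.4, 1.6, 2.1, 3.8, 4.0, 5.6, 5.8, 7.2]
services = [2.0, 0.7, 0.2, 1.1, 3.7, 2.8, 1.6, 3.1]

# Reconstruct service start and completion times: each job starts at its
# arrival time or when the previous job finishes, whichever is later.
starts, ends = [], []
prev_end = 0.0
for a, s in zip(arrivals, services):
    start = max(a, prev_end)
    starts.append(start)
    prev_end = start + s
    ends.append(prev_end)

delays = [st - a for st, a in zip(starts, arrivals)]
n = 5
T_n = ends[n - 1]                         # time of the 5th completion, 8.6
print("D(n) =", sum(delays[:n]) / n)      # discrete-time statistic: average delay

# Continuous-time statistics: integrate Q(t) (jobs waiting, excluding the one
# in service) and B(t) (server-busy indicator) over [0, T(n)].
step = 0.001
area_q = area_b = 0.0
t = 0.0
while t < T_n:
    in_system = sum(1 for a, e in zip(arrivals, ends) if a <= t < e)
    busy = 1 if any(st <= t < e for st, e in zip(starts, ends)) else 0
    area_q += (in_system - busy) * step
    area_b += busy * step
    t += step

print("q(n) ~", round(area_q / T_n, 2))   # about 1.15
print("u(n) ~", round(area_b / T_n, 2))   # about 0.90
```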

Discrete-Event Simulation
These values are simple illustrations of the statistics of discrete-event simulation: discrete-time statistics (e.g. the average delay in queue) or continuous-time statistics (e.g. the proportion of time the server is busy). A very large number of other useful statistics can be obtained from each simulation run, and very complex manufacturing systems can be modeled using this same simple approach.

However, the model MUST be verified and validated to add credibility to the results (see 4. Verification and Validation). Experimental runs should then be carried out using the validated model.

The values from each experimental run are based on a sample size of 1 (one complete simulation run), and a sample size of 1 is not statistically useful. Multiple replications and confidence intervals are therefore essential elements of simulation output data analysis (see 5. Output data analysis).

Probability, Statistics and Simulation
Probability and statistics are an integral part of a simulation study. We need to understand how to:
- model a probabilistic system
- validate a simulation model
- select input probability distributions
- generate and use random samples from these distributions
- perform statistical analysis of output data
- design the simulation experiments

2. Input probability distributions

Randomness in Manufacturing
Process time, MTTR, MTTF, interarrival time, job types or part mix, yield, rework, transport time, setup time, and so on.

Using Past Data
- Use past data directly in the simulation. This is trace-driven simulation (effective for model validation).
- Use the sample data to define an empirical distribution function and sample the required input data from this distribution.
- Use a standard technique (e.g. regression) to fit a theoretical distribution form (e.g. exponential) to the sample data, and sample the required data from it.

Steps in Selecting Input Probability Distributions
- Assess sample independence: confirm that the observations X_1, X_2, ..., X_n are independent, using techniques such as a correlation plot.
- Hypothesize families of distributions: without concern for specific parameters, select a general family, e.g. normal, exponential.
- Estimate parameters: use the past numerical data to estimate the parameters.
- Determine the best fit: use a technique such as a probability plot or a chi-square test to identify the most suitable distribution function.
The last three steps are an integral part of available software, so we may not have to carry them out manually. (A small fitting sketch follows.)
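
As a concrete illustration of these steps (not from the lecture), the following Python sketch fits an exponential distribution to a sample of interarrival times with SciPy and checks the fit with a Kolmogorov-Smirnov test; the chi-square test named above would be an equally valid choice. The data, the variable names, and the use of SciPy are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical sample of observed interarrival times (hours); in practice these
# would come from collected factory data.
rng = np.random.default_rng(42)
observed = rng.exponential(scale=1.2, size=200)

# Step 1: assess independence with the lag-1 sample autocorrelation
# (a full correlation plot would look at several lags).
lag1 = np.corrcoef(observed[:-1], observed[1:])[0, 1]
print("lag-1 autocorrelation:", round(lag1, 3))

# Steps 2-3: hypothesize an exponential family and estimate its parameter
# (location fixed at 0, scale = mean interarrival time).
loc, scale = stats.expon.fit(observed, floc=0)
print("fitted mean interarrival time:", round(scale, 3))

# Step 4: goodness of fit via a Kolmogorov-Smirnov test against the fitted CDF.
ks_stat, p_value = stats.kstest(observed, "expon", args=(loc, scale))
print("KS statistic:", round(ks_stat, 3), " p-value:", round(p_value, 3))
```

Note that when the parameters are estimated from the same data, the KS p-value is only approximate; the software-based fitting mentioned in the lecture handles such details automatically.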

3. Generating Random Numbers and Random Variates

Status
Early simulation studies required random number generation and the generation of random variates from the distributions, often manually coded on computers. Most current simulation languages and simulators have built-in features for this.

Random Number Generation for Simulation
The built-in feature should:
- generate random numbers uniformly distributed on U[0,1] that do not exhibit any correlation with each other
- reproduce a given stream of random numbers exactly (i.e. identical random numbers) for verification, etc.
- be able to generate a large number of streams for multiple replications (i.e. different streams act as separate and independent generators)

Random Variate Generation for Simulation
The built-in feature should also generate random variates. This means producing observations for each selected variable (e.g. MTTR) from the parameters of the desired distribution function (e.g. gamma), using the IID U(0,1) random numbers, with computational efficiency. (A sketch is given below.)
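
The sketch below (not from the lecture) illustrates both ideas: a simple linear congruential generator with reproducible, seedable streams, and the inverse-transform method for turning its U(0,1) output into exponential variates (a special case of the gamma family; general gamma variates usually need acceptance-rejection methods). The multiplier and modulus shown are the classic "minimal standard" Lehmer constants; a production simulator would use a stronger generator.

```python
import math

class LCG:
    """Lehmer / linear congruential generator producing U(0,1) numbers.

    Each instance is an independent, reproducible stream: the same seed always
    yields the same sequence, which supports verification and replication."""

    M = 2**31 - 1        # modulus
    A = 16807            # multiplier ("minimal standard" constants)

    def __init__(self, seed):
        self.state = seed

    def uniform(self):
        self.state = (self.A * self.state) % self.M
        return self.state / self.M

def expo_variate(u, mean):
    """Inverse-transform method: if U ~ U(0,1), then -mean*ln(1-U) is exponential."""
    return -mean * math.log(1.0 - u)

stream1 = LCG(seed=12345)     # e.g. drives interarrival times
stream2 = LCG(seed=67890)     # e.g. drives repair times (MTTR), independently

interarrival = expo_variate(stream1.uniform(), mean=1.5)
repair_time = expo_variate(stream2.uniform(), mean=4.0)
print(interarrival, repair_time)
```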

4. Verification and Validation

Definitions
Model verification: building the model right. Correct translation of the conceptual simulation model into a working program; debugging.
Model validation: building the right model. Determining whether the conceptual simulation model is an accurate representation of the system.
Credible model: objectives fulfilled using the model. A model is credible when it is accepted by the user/client and used for the purpose for which it was built.

Verification
Errors arise from the data, the conceptual model, the computer model, even the computer system. Test sub-models first, then the complete model. Common techniques:
- Static: a structured walk-through technique.
- Dynamic: run the program under different conditions, then check whether the output is reasonable.
- Trace: record selected state variables (and the event list) after each event and check them against manual calculations.
- Animation: observe the animation.

What is Validation?
A model is valid if its output behavior is sufficiently accurate to fulfill the purpose; absolute accuracy is not essential, and it is too time-consuming and expensive. Check the underlying theories, assumptions and approximations. Check the model structure and logic, the mathematics and relationships (by tracing entities in all sub-models and the main model). The model should be validated relative to the measures stated in the objective.

Validation
Data also have to be validated: it is difficult, time-consuming and costly to obtain relevant, sufficient and, most importantly, consistent and accurate factory data.

A Three-Step Validation Process (for conventional simulation studies)
Step 1. Face validation: ask people knowledgeable about or experienced with the system under study.
Step 2. Empirically test and compare with other models, e.g. analytical models.
Step 3. Detailed output data validation: (a) confidence intervals; (b) the correlated inspection approach.

Confidence Intervals
Let Y_1, Y_2, ..., Y_n be IID random variables with mean \mu, sample mean \bar{Y}(n) and sample variance S^2(n). It can be shown, using the central limit theorem, that when n is sufficiently large an approximate 100(1-\alpha) percent confidence interval (assuming a t distribution with n-1 degrees of freedom) is given by

l(n,\alpha) = \bar{Y}(n) - t_{n-1,1-\alpha/2}\,\sqrt{\frac{S^2(n)}{n}}, \qquad u(n,\alpha) = \bar{Y}(n) + t_{n-1,1-\alpha/2}\,\sqrt{\frac{S^2(n)}{n}}

[Figure: t density with shaded central area 1-\alpha between -t_{n-1,1-\alpha/2} and +t_{n-1,1-\alpha/2}.]

Validation: Confidence Interval Approach
Assume we collect m independent sets of data from the system and n independent sets of data from the model. Let X_j be the average of the observations of a desired variable (e.g. throughput) in the j-th set of system data, and Y_j the average of the observations of the same variable in the j-th set of model data. (A numerical sketch of the basic confidence interval follows.)
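
A minimal Python sketch (not from the lecture) of the confidence interval above, using SciPy's t quantile; the sample data are made up for illustration.

```python
import math
from statistics import mean, variance
from scipy import stats

# Hypothetical averages of a performance measure from n = 10 independent replications.
y = [21.4, 19.8, 22.1, 20.5, 21.0, 23.2, 19.5, 20.9, 22.4, 21.7]

n = len(y)
y_bar = mean(y)                 # sample mean Y_bar(n)
s2 = variance(y)                # sample variance S^2(n), with n-1 in the denominator
alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
half_width = t_crit * math.sqrt(s2 / n)

print(f"approximate 95% CI: [{y_bar - half_width:.2f}, {y_bar + half_width:.2f}]")
```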

If we assume the m sets of system data are homogeneous, then the X_j's are IID random variables with mean \mu_x. If the n sets of simulation model data were generated using independent replications, then the Y_j's are IID random variables with mean \mu_y.

One way to compare the model with the system is to construct a confidence interval for \mu_x - \mu_y. Let l(\alpha) and u(\alpha) be the lower and upper confidence interval endpoints for \mu_x - \mu_y. If 0 \notin [l(\alpha), u(\alpha)], then the observed difference between \mu_x and \mu_y is said to be statistically significant at level \alpha.

However, even if the difference is statistically significant, the model may still be a valid enough representation of the system to fulfill the objectives of the simulation. On the other hand, if 0 \in [l(\alpha), u(\alpha)], then the observed difference between \mu_x and \mu_y is said to be statistically not significant at level \alpha and may be explained by sampling fluctuations.

Validation: Correlated Inspection Approach
Statistics of the desired variable from the system are compared with the corresponding statistics of the model.
[Diagram: historical data from the factory drive both the actual system (the factory) and the simulation model; the output data from the factory are compared with the model output data.]

Validation: Correlated Inspection Approach
Suppose we want to validate the cycle time (e.g. the makespan, or duration from arrival to completion). We make a number of observations of the factory cycle time X_j and, for example, of the interarrival times of the jobs. We then use the observed factory interarrival times to drive the simulation model and obtain cycle time values Y_j from it.

We compare the j-th set of system data X_j with the simulation model data Y_j, where X_j - Y_j is an estimate of \mu_x - \mu_y. We can look at the sample mean and sample variance of all the differences X_j - Y_j to make a judgment on the validity of the simulation model for fulfilling the objectives.

5. Output data analysis

Transient and Steady-State Behavior
Experimental observations for output analysis should be made at steady state, i.e. after the transient phase of the stochastic simulation run. Consider the output random variables Y_i for i = 1, 2, ..., m, and let

F_i(y \mid I) = P(Y_i \le y \mid I)

at discrete time i, where y is a real number and I represents the initial conditions. This transient distribution is different for each i and for each set of initial conditions I.

If

F_i(y \mid I) \to F(y) \text{ as } i \to \infty

for all y and any initial condition I, then F(y) is said to be the steady-state distribution.
[Figure: the transient densities of Y_i at times i_1 < i_2 < i_3, with transient means E(Y_i), converging to the steady-state density (not necessarily normal) with mean \nu = E(Y).]

Random Nature of Simulation Output
Let Y_1, Y_2, ..., Y_i, ..., Y_m be the output of the stochastic process from a single simulation run (e.g. Y_i is the throughput in the i-th hour, and m is the number of observations). The Y_i's are random variables and generally not IID. If we carry out n independent replications (using n different random number streams), we obtain the set of random observations (a sketch of building such a replication matrix follows):

y_{11}, ..., y_{1i}, ..., y_{1m}
y_{21}, ..., y_{2i}, ..., y_{2m}
...
y_{n1}, ..., y_{ni}, ..., y_{nm}
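
The sketch below (not from the lecture) builds such an n x m replication matrix for a made-up output process and averages down each column; the column averages anticipate the estimator \bar{y}_i(n) introduced next, and plotting them against i is one common way to judge where the transient ends. The output process itself is an arbitrary illustrative assumption.

```python
import random

def one_replication(stream, m):
    """One simulation run: m hourly observations of a hypothetical throughput
    process that starts empty (transient) and drifts toward a steady state."""
    y, level = [], 0.0
    for _ in range(m):
        level += 0.2 * (10.0 - level) + stream.gauss(0.0, 1.0)  # warm-up toward ~10
        y.append(level)
    return y

n, m = 20, 50
matrix = [one_replication(random.Random(seed), m) for seed in range(n)]  # n rows, m columns

# Within a row (one replication) the observations are correlated, but down a
# column the n values y_1i, ..., y_ni are IID, so their average estimates E(Y_i).
column_means = [sum(row[i] for row in matrix) / n for i in range(m)]
print([round(v, 2) for v in column_means[:10]])   # rising values reveal the transient
```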

Output Analysis
Observations from a single replication (a row) are not IID. However, y_{1i}, y_{2i}, ..., y_{ni} from the n replications (a column) are IID observations of the random variable Y_i, for i = 1, 2, ..., m. This is the basis of the statistical analysis of the observations y_{ji}. For example, an unbiased estimate of E(Y_i) is

\bar{y}_i(n) = \frac{\sum_{j=1}^{n} y_{ji}}{n}

Confidence Interval in Output Analysis: an example
Comparing two systems on a given measure of performance is done by forming a confidence interval for the difference of the two expected values, for example E(Y_{1j}) - E(Y_{2j}), and checking whether it contains 0. If the numbers of observations are n_1 = n_2 = n, then we pair Y_{1j} with Y_{2j} and define Z_j = Y_{1j} - Y_{2j} for j = 1, 2, ..., n; the Z_j's are IID random variables. An approximate 100(1-\alpha) percent confidence interval is

\bar{Z}(n) \pm t_{n-1,1-\alpha/2}\,\sqrt{\frac{S^2(n)}{n}}

This is the paired-t confidence interval (a sketch follows).
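
A minimal Python sketch (not from the lecture) of the paired-t confidence interval for comparing two alternative system designs; the replication results are made-up numbers, and running both alternatives with common random number streams (as recommended later under experimental design) is what makes the pairing meaningful.

```python
import math
from statistics import mean, variance
from scipy import stats

# Hypothetical average cycle times (hours) from n = 8 paired replications of two
# alternative operating policies, each pair run with the same random number streams.
y1 = [14.2, 13.8, 15.1, 14.6, 13.9, 14.8, 15.3, 14.1]   # policy 1
y2 = [13.5, 13.6, 14.2, 14.0, 13.1, 14.1, 14.6, 13.4]   # policy 2

z = [a - b for a, b in zip(y1, y2)]       # Z_j = Y_1j - Y_2j, IID across replications
n = len(z)
z_bar = mean(z)
s2 = variance(z)                          # S^2(n) of the paired differences
alpha = 0.10
half_width = stats.t.ppf(1 - alpha / 2, df=n - 1) * math.sqrt(s2 / n)

lo, hi = z_bar - half_width, z_bar + half_width
print(f"90% paired-t CI for E(Y1) - E(Y2): [{lo:.3f}, {hi:.3f}]")
print("difference statistically significant" if not (lo <= 0.0 <= hi) else
      "difference not statistically significant")
```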

Steps in Conventional Simulation
[Diagram: the real-world system is abstracted into a simulation model; experimentation produces formal results, which are interpreted into recommendations and then implemented back in the real-world system.]

Credibility Assessment in Simulation Projects

Credibility Assessment Stages: Quality Control of Processes Between Phases
Credibility assessment will always be subjective, because modeling is an art and credibility assessment is situation-dependent. The accuracy of the assessment is always relative to the objective of the simulation, never absolute.

Peer assessment: a panel of persons who are experts in the system under study, expert modelers, and simulation analysts familiar with simulation projects.

Verify the formulated problem, to make sure it faithfully reflects the real problem.
Feasibility of simulation: is data available? Easy or costly to get? Are resources for the simulation available? Cost-benefit: is any time limit imposed on completing the study?
The real system: are the system's boundaries well defined? Have the objectives of the simulation changed with time? Is counter-intuitive behavior accounted for? Is there any drift to low performance?

Qualifying the conceptual model: are the assumptions explicitly defined and appropriate?
Verifying the communicative model: use techniques such as walk-through, structural analysis, and data-flow analysis (Whitner & Balci, 1986).
Verifying the programmed model: use standard software verification techniques.

Verify the experimental design: is the random number generator accurate and true? Are the statistical techniques for the design and analysis of the experiments appropriate? Are initial transients accounted for? Have you ensured identical experimental conditions for each alternative operating policy?
Data validation (of model parameters and input data): are the data appropriate, current, unbiased, inter-dependent, complete, and accurate? Are the instruments for data measurement and collection accurate?

Validating the experimental model: always compare the behavior of the model and the real system under identical input conditions; subjective and statistical validation techniques are applicable only when the data are completely observable.
Interpretation of simulation results: interpret numerical results based on the objective of the study; judgment is involved.
Documentation: embed documentation into the model development cycle.

Presentation: communicating simulation results. Translate the jargon so that non-simulation people and decision-makers can understand; use presentation techniques; integrate the simulation results with a DSS so that the decision-maker can appreciate their significance.

Wrap-up
We have looked at the simulation of an M/M/1 queue, discussed input probability distributions, talked about random numbers and random variates, discussed verification and validation techniques, outlined output data analysis and confidence intervals, and covered the life cycle and credibility assessment of simulation models.