Techniques for Verifying the Accuracy of Simulation Models - Prof. Kaur, Exercises of Software Engineering

This document covers various techniques for verifying and validating simulation models, including statistical comparison approaches and good programming practices. It emphasizes the importance of ensuring that a simulation model is a valid representation of the actual system being studied.

Typology: Exercises

2019/2020

Uploaded on 10/03/2022

amit-suthar 🇮🇳



Validation and Verification

of Simulation Models

Outline

Introduction

 Definitions of validation and verification

 Techniques for verification of simulation models

 Techniques for validation of simulation models

 Statistical methods for comparing real-world observations with simulation output data

 Inspection approach

 Confidence-interval approach

Summary

Steps in Simulation Study

1. Formulate the problem and plan the study

2. Collect data and define a model

3. Check for validity

4. Write a computer program and verify

5. Make a pilot run

6. Check for validity

7. Design the experiments

8. Make production runs

9. Analyze output data

10. Document, present, implement results

[Figure: Building Simulation Models — System → Conceptual model → Simulation program → "Correct" results available → Results implemented. Validation links the system to the conceptual model and to the results; verification links the conceptual model to the program; implemented results establish credibility. Activities: analysis & data (steps 1, 2), programming (step 4), model runs (steps 5–), selling results (step 10).]

Validation, Verification

 Validation is concerned with determining whether the conceptual simulation model itself is an accurate representation of the system under study.

 If the model is “valid,” the conclusions drawn from it should be similar to those that would be obtained by physically experimenting with the system.

 When the simulation model and its results are accepted by the manager/client as valid, and are used to make managerial decisions, the model is said to be credible.

Validation of Simulation model

 The goal of validation is to ensure that the simulation model is good enough to be used to make decisions about the system we would ideally like to work with.

 The ease or difficulty of the validation process depends on the complexity of the system being modeled and on whether a version of the system currently exists.

 A simulation model of a complex system can only be an approximation to the actual system, regardless of how much effort is put into development. There is no such thing as an absolutely valid model!

 A simulation model is always developed for a particular purpose.

What are Validation and Verification?

 Validation is the process of determining whether the conceptual model is an accurate representation of the actual system being analyzed. Validation deals with building the right model.

 Verification is the process of determining whether the simulation computer program works as intended (i.e., debugging the computer program). Verification deals with building the model right.

[Figure: Real-world system → Conceptual model (validation) → Simulation program (verification), with validation also linking the simulation program back to the real-world system.]

Techniques for Verification of

Simulation Models

 Use good programming practice:

Write and debug the computer program in modules or subprograms. In general, it is better to start with a “moderately detailed” model and embellish it later, if needed.

 Use a “structured walk-through”:

Have more than one person read the computer program.

 Use a “trace”:

The analyst may use a trace to print out intermediate results and compare them with hand calculations to see whether the program is operating as intended.

 Compare final simulation output with analytical results:

Verify the simulation response by running a simplified version of the simulation program whose mean response is known analytically. If the results of the simulation do not deviate significantly from the known mean response, the true distributions can then be used. For example, for a queueing simulation model, queueing theory can be used to estimate steady-state responses (e.g., mean time in queue, average utilization). These formulas, however, assume exponential interarrival and service times with n servers (M/M/n).
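As an illustration of the analytical-comparison technique, here is a minimal Python sketch (not from the slides; the queue parameters are made up): it simulates an M/M/1 queue with Lindley's recurrence and compares the simulated mean wait in queue with the known steady-state formula Wq = λ/(μ(μ − λ)).

```python
import random

def mm1_mean_wait(lam, mu, n_customers, seed=42):
    """Average wait in queue for an M/M/1 queue, estimated by simulating
    successive customer delays with Lindley's recurrence:
    D[n+1] = max(0, D[n] + S[n] - A[n+1])."""
    rng = random.Random(seed)
    delay = 0.0            # D[1] = 0: the first customer never waits
    total_delay = 0.0
    for _ in range(n_customers):
        total_delay += delay
        service = rng.expovariate(mu)        # S[n]
        interarrival = rng.expovariate(lam)  # A[n+1]
        delay = max(0.0, delay + service - interarrival)
    return total_delay / n_customers

lam, mu = 0.5, 1.0                        # illustrative arrival and service rates
analytical = lam / (mu * (mu - lam))      # Wq = 0.5 / (1.0 * 0.5) = 1.0
simulated = mm1_mean_wait(lam, mu, n_customers=200_000)
print(f"analytical Wq = {analytical:.3f}, simulated Wq = {simulated:.3f}")
```

If the two agree closely, the program's event logic and random-variate generation gain credibility; a large discrepancy points to a bug in the simulation program.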


Validation of Simulation Models

As an aid in the validation process, Naylor and Finger formulated a three-step approach which has been widely followed:

1. Build a model that has high face validity.

2. Validate model assumptions.

3. Compare the model input–output transformations to corresponding input–output transformations for the real system.

Techniques for Validation of Simulation Models

 Test the assumptions of the model empirically:

In this step, the assumptions made in the initial stages of model development are tested quantitatively. For example, if a theoretical distribution has been fitted to some observed data, graphical methods and goodness-of-fit tests are used to test the adequacy of the fit.

Sensitivity analysis can be used to determine whether the output of the model changes significantly when an input distribution or the value of an input variable is changed. If the output is sensitive to some aspect of the model, that aspect must be modeled very carefully.
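A goodness-of-fit check of this kind can be sketched in a few lines of Python (the data below are synthetic, and we assume the hypothesized exponential rate is fully specified rather than estimated from the data): compute the one-sample Kolmogorov–Smirnov statistic and compare it with the approximate 5% critical value 1.358/√n.

```python
import math
import random

def ks_statistic_exponential(data, rate):
    """One-sample Kolmogorov-Smirnov statistic D = sup |F_n(x) - F(x)|
    against an Exponential(rate) distribution with a known, fixed rate."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)             # exponential CDF F(x)
        d = max(d, (i + 1) / n - cdf, cdf - i / n)  # gaps above and below F
    return d

rng = random.Random(0)
observed = [rng.expovariate(2.0) for _ in range(500)]  # synthetic "observed" interarrival times
d = ks_statistic_exponential(observed, rate=2.0)
critical = 1.358 / math.sqrt(len(observed))            # approx. 5% critical value for large n
print(f"D = {d:.4f}, 5% critical value = {critical:.4f}: "
      f"{'fit not rejected' if d < critical else 'fit rejected'}")
```

Note that if the rate had been estimated from the same data, these standard critical values would no longer apply and a Lilliefors-type correction would be needed.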

Techniques for Validation of Simulation Models

 Determine how representative the simulation output data are:

The most definitive test of a model’s validity is determining how closely the simulation output resembles the output from the real system.

A Turing test can be used to compare the simulation output with the output from the real system: the output data from the simulation are presented to people knowledgeable about the system in exactly the same format as the system data. If the experts can differentiate between the simulation and system outputs, their explanation of how they did so should be used to improve the model.

Statistical Methods for Comparing Real-World Observations with Simulation Output Data

Two approaches for comparing the outputs from the real-world system with the simulation outputs are:

 Inspection Approach

 Confidence-Interval Approach

Inspection Approach

 Run the simulation model with historical system input data (e.g., actual observed interarrival and service times) instead of sampling from the input probability distributions, and compare the system and model output data.

 The system and the model thus experience exactly the same input observations, rather than independently sampled random variates.

 This approach results in the model and system outputs being positively correlated.
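A minimal sketch of the inspection approach in Python (all "historical" numbers below are made-up illustrations, not real data): the recorded interarrival and service times drive a FIFO single-server model, and the model's per-customer delays are compared pairwise with the delays recorded from the real system under the same inputs.

```python
def single_server_delays(interarrivals, services):
    """Per-customer queue delays in a FIFO single-server model driven by
    historical inputs, via Lindley's recurrence.
    interarrivals[i] is the time between the arrivals of customers i-1 and i
    (interarrivals[0] is unused); services[i] is customer i's service time."""
    delays = [0.0]  # the first customer never waits
    for i in range(1, len(services)):
        d = max(0.0, delays[-1] + services[i - 1] - interarrivals[i])
        delays.append(d)
    return delays

# Made-up historical records (the same inputs fed to system and model):
interarrivals = [0.0, 1.2, 0.4, 2.0, 0.7, 1.5]
services      = [1.0, 0.8, 1.1, 0.5, 0.9, 0.6]
system_delays = [0.0, 0.1, 0.5, 0.0, 0.1, 0.0]  # delays recorded from the real system

model_delays = single_server_delays(interarrivals, services)
paired_diffs = [s - m for s, m in zip(system_delays, model_delays)]
mean_diff = sum(paired_diffs) / len(paired_diffs)
print(f"model delays: {[round(d, 2) for d in model_delays]}")
print(f"mean (system - model) delay difference: {mean_diff:.3f}")
```

Because both output series are driven by the same inputs, the paired differences have lower variance than a comparison of independently generated runs, which is exactly the positive correlation the inspection approach exploits.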