Steps in Simulation Study



Problem formulation

  • Every study begins with a statement of the problem. If the statement is provided by the policymakers, the analyst must ensure it is clearly understood; if it is prepared by the analyst, the policymakers should understand and agree with it.

Setting of objectives and overall project plan

  • The objectives indicate the questions to be answered by the simulation. At this point, it should be determined whether simulation is the appropriate methodology for the problem and the stated objectives.

Assuming it is appropriate, the overall project plan should include:

  • A statement of the alternative systems

  • A mechanism for evaluating the effectiveness of these alternatives 

  • Study plans, including the number of persons involved 

  • Study cost 

Model conceptualization

  • The creation of a system model is as much art as science.

  • The art of modeling is enhanced by the ability to:

  • Extract the most important aspects of a problem

  • Select and modify the system's fundamental assumptions

  • Enrich and elaborate the model until it yields a reasonable approximation

As a result, it is advisable to start with a simple model and build toward greater complexity. Involving the model user in conceptualization improves the quality of the final model and increases the user's confidence in applying it.

Data collection

  • There is a constant interplay between building the model and collecting the required input data; data collection should begin in the early stages of the project.

Model translation

  • Models of real-world systems require a great deal of information storage and computation, so the model must be programmed. It can be coded in a general-purpose simulation language or in special-purpose simulation software.

  • Simulation languages are versatile and powerful; special-purpose simulation software can reduce the time required to develop the model.
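As a minimal illustration of model translation, the sketch below programs a single-server FIFO queue in Python. The queue model, its rates, and the function name are illustrative assumptions, not part of the notes above; a real study would likely use a simulation language or package.

```python
import random

def simulate_queue(n_customers, arrival_rate, service_rate, seed=42):
    """Waiting times in a single-server FIFO queue via Lindley's
    recursion: W[i] = max(0, W[i-1] + S[i-1] - A[i])."""
    rng = random.Random(seed)
    waits = [0.0]                       # the first customer never waits
    prev_service = rng.expovariate(service_rate)
    for _ in range(1, n_customers):
        interarrival = rng.expovariate(arrival_rate)
        waits.append(max(0.0, waits[-1] + prev_service - interarrival))
        prev_service = rng.expovariate(service_rate)
    return waits

waits = simulate_queue(10_000, arrival_rate=0.8, service_rate=1.0)
avg_wait = sum(waits) / len(waits)
```

Even this toy model makes the point of the step: translating a conceptual model into code forces every assumption (arrival process, service process, queue discipline) to be stated precisely.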

Verification

  • Verification pertains to the computer program and whether it performs properly. It is complete when the input parameters and the logical structure of the model are correctly represented in the program.
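One common verification tactic is to drive the program with a deterministic input trace whose correct output can be computed by hand, then check that the code reproduces it. The sketch below (a hypothetical helper, using the same single-server-queue logic as an example) illustrates the idea:

```python
def lindley_waits(interarrivals, services):
    """Waiting time of each customer in a FIFO single-server queue:
    W[0] = 0, W[i] = max(0, W[i-1] + S[i-1] - A[i])."""
    waits = [0.0]
    for i in range(1, len(services)):
        waits.append(max(0.0, waits[-1] + services[i - 1] - interarrivals[i]))
    return waits

# Deterministic trace: customers arrive 1 time unit apart, each needing
# 2 time units of service, so each customer waits 1 unit longer than the
# previous one -- a result easy to verify by hand.
waits = lindley_waits(interarrivals=[0, 1, 1, 1], services=[2, 2, 2, 2])
assert waits == [0.0, 1.0, 2.0, 3.0]
```

If the program's output disagrees with the hand calculation, the logical structure has not been represented correctly.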

Validation

  • Validation is the determination of whether the model is an accurate representation of the real system. It is usually achieved through model calibration, an iterative process of comparing the model against actual system behavior and using the discrepancies, and the insight they provide, to improve the model.
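A simple quantitative form of this comparison is to check whether the observed system value falls inside a confidence interval built from the model's replications. The numbers below are entirely hypothetical, chosen only to show the mechanics:

```python
import statistics

def mean_ci(sample, t_crit):
    """Approximate confidence interval for a sample mean."""
    m = statistics.mean(sample)
    half = t_crit * statistics.stdev(sample) / len(sample) ** 0.5
    return m - half, m + half

# Hypothetical data: average waiting times from 10 model replications,
# versus the average measured on the real system.
model_reps = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.3, 3.7, 4.0, 4.1]
observed_mean = 4.05

lo, hi = mean_ci(model_reps, t_crit=2.262)   # t for 95%, 9 d.f.
consistent = lo <= observed_mean <= hi       # True: no evidence of mismatch
```

When the observed value falls outside the interval, the discrepancy feeds back into calibration: assumptions are revisited and the model is refined.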

Experimental Design

  • The alternatives to be simulated must be determined. Often, the choice of which alternatives to simulate is a function of runs that have already been completed and analyzed.

  • For each system design being simulated, decisions need to be made concerning:

  • Length of the initialization (warm-up) period

  • Length of the simulation runs

  • Number of replications to be made of each run
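The three decisions above can be made concrete in a small sketch: each replication discards observations from a warm-up period, runs for a fixed length, and the whole run is repeated several times with different seeds. The queue model and all parameter values here are illustrative assumptions.

```python
import random
import statistics

def one_run(run_length, warmup, seed):
    """One replication of a toy queue model: the waiting time evolves by
    W = max(0, W + service - interarrival); observations are kept only
    after the warm-up (initialization) period is deleted."""
    rng = random.Random(seed)
    wait, observations = 0.0, []
    for i in range(run_length):
        wait = max(0.0, wait + rng.expovariate(1.0) - rng.expovariate(0.8))
        if i >= warmup:
            observations.append(wait)
    return statistics.mean(observations)

# The three experimental-design decisions, made explicit as parameters:
# warm-up length, run length, and number of replications.
rep_means = [one_run(run_length=5000, warmup=500, seed=s) for s in range(10)]
grand_mean = statistics.mean(rep_means)
```

Deleting the warm-up observations reduces the bias caused by starting the model in an empty, idle state that the real system rarely occupies.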

Production runs and analysis

  • Production runs, and their subsequent analysis, are used to estimate measures of performance for the system designs being simulated.

More runs

  • Based on the analysis of the runs that have been completed, the analyst determines whether additional runs are needed and what design those additional experiments should follow.
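One common way to decide whether more runs are needed is a sequential rule: keep adding replications until the confidence-interval half-width for the estimated performance measure falls below a target precision. The sketch below assumes a hypothetical `replication` function standing in for one production run of the model.

```python
import random
import statistics

def replication(seed):
    """One replication of the model; here a stand-in that returns the
    mean of 1000 exponential 'service times' as the output measure."""
    rng = random.Random(seed)
    return statistics.mean(rng.expovariate(1.0) for _ in range(1000))

def run_until_precise(target_half_width, t_crit=2.0, max_reps=100):
    """Simple sequential rule: add replications until the confidence-
    interval half-width of the mean drops below the target."""
    results = [replication(0), replication(1)]
    for seed in range(2, max_reps):
        half_width = t_crit * statistics.stdev(results) / len(results) ** 0.5
        if half_width <= target_half_width:
            break
        results.append(replication(seed))
    return results, half_width

results, half_width = run_until_precise(target_half_width=0.02)
```

The target half-width encodes how precise the performance estimate must be before the analyst stops requesting additional runs.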

Documentation and reporting

  • Two types of documentation:

  1. Program documentation

  2. Process documentation

Program documentation
Program documentation allows the same analyst, or different analysts, to understand how the program works, which makes later modifications simpler. It also gives model users the confidence to alter input parameters in an effort to improve performance.
Process documentation
Process documentation provides the history of the simulation project. The results of all analyses should be presented clearly and concisely in a final report, allowing reviewers to examine the final formulation, the alternatives that were considered, and the results of the recommended solution. The final report also provides a vehicle of certification.

Implementation

  • The success of implementation depends on how well the preceding steps have been performed. If the model user has been thoroughly involved and understands the nature of the model and its outputs, the likelihood of a vigorous implementation is enhanced.