Computer simulation


Computer simulation is the process of mathematical modelling, performed on a computer, which is designed to predict the behaviour of, or the outcome of, a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.

Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. In 1997, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. Other examples include a 1-billion-atom model of material deformation; a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005; a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level.

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification.
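
As a rough illustration, a computer experiment for uncertainty quantification can be as simple as evaluating a simulator at inputs drawn from the uncertain input distribution and summarizing the spread of the outputs. The sketch below (in Python) uses a toy stand-in for the simulator; none of the functions or numbers come from any particular study.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulator(x):
        # Stand-in for an expensive simulation code (hypothetical).
        return x**2 + 0.5 * np.sin(3 * x)

    samples = rng.normal(loc=1.0, scale=0.1, size=5_000)   # uncertain input
    outputs = np.array([simulator(x) for x in samples])    # one run per draw

    # The output distribution, not a single number, is the quantity of interest.
    print(f"mean = {outputs.mean():.3f}, std = {outputs.std():.3f}")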

Pitfalls


Although sometimes ignored in computer simulations, it is very important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g., the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might misleadingly be presented as having four significant figures.
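
A minimal sketch of such a Monte Carlo combination of distributions follows; the distributions, names, and values are assumptions chosen only for illustration, not data from any real exploration program.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000                                            # Monte Carlo draws

    # Hypothetical input distributions for a prospect volume estimate.
    area = rng.lognormal(mean=2.0, sigma=0.3, size=n)      # reservoir area
    thickness = rng.normal(loc=20.0, scale=4.0, size=n)    # pay thickness
    net_ratio = rng.uniform(0.2, 0.4, size=n)              # net oil-bearing ratio,
                                                           # known to ~1 significant figure

    volume = area * thickness * net_ratio                  # combined estimate

    # Report percentiles to one significant figure, matching the least
    # precise input, rather than quoting spurious extra digits.
    p10, p50, p90 = np.percentile(volume, [10, 50, 90])
    print(f"P10={p10:.1g}  P50={p50:.1g}  P90={p90:.1g}")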

The following three steps should be used to produce accurate simulation models: calibration, verification, and validation. Computer simulations are good at portraying and comparing theoretical scenarios, but in order to accurately model actual case studies they have to match what is actually happening today. A base model should be created and calibrated so that it matches the area being studied. The calibrated model should then be verified to ensure that the model is operating as expected based on the inputs. Once the model has been verified, the next step is to validate the model by comparing the outputs to historical data from the study area. This can be done by using statistical techniques and ensuring an adequate R-squared value. Unless these techniques are employed, the simulation model created will produce inaccurate results and not be a useful prediction tool.

Model calibration is achieved by adjusting any available parameters in order to change how the model operates and simulates the process. For example, in traffic simulation, typical parameters include look-ahead distance, car-following sensitivity, discharge headway, and start-up lost time. These parameters influence driver behavior such as when and how long it takes a driver to change lanes, how much distance a driver leaves between his car and the car in front of it, and how quickly a driver starts to accelerate through an intersection. Adjusting these parameters has a direct effect on the amount of traffic volume that can traverse the modeled roadway network by making the drivers more or less aggressive. These are examples of calibration parameters that can be fine-tuned to match characteristics observed in the field at the study location. Most traffic models have typical default values but they may need to be adjusted to better match the driver behavior at the specific location being studied.
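
As a sketch of what such fine-tuning can look like in code, the loop below sweeps one hypothetical parameter and keeps the value whose simulated volume best matches a field count. The simulate() function is a placeholder for a real traffic model, and all numbers are illustrative.

    import numpy as np

    OBSERVED_VOLUME = 1850        # vehicles/hour counted in the field (illustrative)

    def simulate(car_following_sensitivity):
        # Placeholder for a full traffic-model run; this toy response curve
        # merely lets the calibration loop execute end to end.
        return 2100 - 400 * abs(car_following_sensitivity - 0.8)

    candidates = np.linspace(0.4, 1.2, 17)                 # plausible range
    errors = [abs(simulate(s) - OBSERVED_VOLUME) for s in candidates]
    best = candidates[int(np.argmin(errors))]
    print(f"calibrated car-following sensitivity ~ {best:.2f}")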

Model verification is achieved by obtaining output data from the model and comparing them to what is expected from the input data. For example, in traffic simulation, traffic volume can be verified to ensure that actual volume throughput in the model is reasonably close to traffic volumes input into the model. Ten percent is a typical threshold used in traffic simulation to determine if output volumes are reasonably close to input volumes. Simulation models handle model inputs in different ways so traffic that enters the network, for example, may or may not reach its desired destination. Additionally, traffic that wants to enter the network may not be able to, if congestion exists. This is why model verification is a very important part of the modeling process.
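
A hedged sketch of such a check, using the ten percent threshold mentioned above; the link names and volumes are made up for the example.

    input_volumes = {"link_A": 1200, "link_B": 800, "link_C": 1500}   # demand fed in
    model_volumes = {"link_A": 1150, "link_B": 610, "link_C": 1480}   # model throughput

    THRESHOLD = 0.10    # typical verification threshold in traffic simulation

    for link, demand in input_volumes.items():
        deviation = abs(model_volumes[link] - demand) / demand
        # A flagged link may indicate unserved demand, e.g. due to congestion.
        status = "OK" if deviation <= THRESHOLD else "CHECK"
        print(f"{link}: {deviation:.1%} {status}")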

The next step is to validate the model by comparing the results with what is expected based on historical data from the study area. Ideally, the model should produce similar results to what has happened historically. This is typically verified by nothing more than quoting the R-squared statistic from the fit. This statistic measures the fraction of variability that is accounted for by the model. A high R-squared value does not necessarily mean the model fits the data well. Another tool used to validate models is graphical residual analysis. If model output values drastically differ from historical values, it probably means there is an error in the model. Before using the model as a base to produce additional models, it is important to verify it for different scenarios to ensure that each one is accurate. If the outputs do not reasonably match historic values during the validation process, the model should be reviewed and updated to produce results more in line with expectations. It is an iterative process that helps to produce more realistic models.
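
The R-squared and residual computations themselves are straightforward; the sketch below shows both on made-up historical and modeled values (a real validation would use observed data from the study area).

    import numpy as np

    historical = np.array([1050.0, 980.0, 1210.0, 1340.0, 1120.0])
    modeled = np.array([1010.0, 1005.0, 1180.0, 1290.0, 1160.0])

    residuals = historical - modeled
    ss_res = np.sum(residuals**2)                          # residual sum of squares
    ss_tot = np.sum((historical - historical.mean())**2)   # total sum of squares
    r_squared = 1 - ss_res / ss_tot
    print(f"R^2 = {r_squared:.3f}")

    # A high R^2 alone is not enough: inspect the residuals and look for
    # systematic patterns (the point of graphical residual analysis).
    print("residuals:", residuals)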

Validating traffic simulation models requires comparing traffic estimated by the model to observed traffic on the roadway and transit systems. Initial comparisons are for trip interchanges between quadrants, sectors, or other large areas of interest. The next step is to compare traffic estimated by the models to traffic counts, including transit ridership, crossing contrived barriers in the study area. These are typically called screenlines, cutlines, and cordon lines and may be imaginary or actual physical barriers. Cordon lines surround particular areas such as a city's central business district or other major activity centers. Transit ridership estimates are usually validated by comparing them to actual patronage crossing cordon lines around the central business district.
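
A minimal sketch of such a screenline comparison, with invented screenline names and counts purely for illustration:

    screenlines = {
        # name: (observed_count, modeled_count) -- illustrative values
        "river_crossing": (48200, 45900),
        "cbd_cordon": (31500, 34100),
        "rail_cutline": (12800, 12300),
    }

    for name, (observed, modeled) in screenlines.items():
        pct_diff = (modeled - observed) / observed
        print(f"{name}: modeled {modeled:,} vs observed {observed:,} ({pct_diff:+.1%})")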

Three sources of error can cause weak correlation during calibration: input error, model error, and parameter error. In general, input error and parameter error can be adjusted easily by the user. Model error, however, is caused by the methodology used in the model and may not be as easy to fix. Simulation models are typically built using several different modeling theories that can produce conflicting results. Some models are more generalized while others are more detailed. If model error occurs as a result, it may be necessary to adjust the model methodology to make results more consistent.

These are the fundamental steps that need to be taken in order to produce accurate models that yield realistic results and to ensure that simulation models are functioning properly. Simulation models can be used as a tool to verify engineering theories, but they are only valid if calibrated properly. Once satisfactory estimates of the parameters for all models have been obtained, the models must be checked to ensure that they adequately perform the intended functions. The validation process establishes the credibility of the model by demonstrating its ability to replicate reality. The importance of model validation underscores the need for careful planning, thoroughness and accuracy of the input data collection program that has this purpose. Efforts should be made to ensure collected data is consistent with expected values. For example, in traffic analysis it is typical for a traffic engineer to perform a site visit to verify traffic counts and become familiar with traffic patterns in the area. The resulting models and forecasts will be no better than the data used for model estimation and validation.