    What theorems are simulation modeling based on? The concept of a simulation model

    Simulation model - a description of a system and its behavior that can be implemented and investigated through operations on a computer.

    Simulation is most often used to describe the properties of a large system when the behavior of its constituent objects is very simple and clearly formulated. The mathematical description is then reduced to statistical processing of the simulation results when the macroscopic characteristics of the system are computed. Such a computer experiment in effect stands in for a natural experiment. Simulation modeling is a special case of mathematical modeling. There is a class of objects for which, for various reasons, analytical models have not been developed or methods for solving the resulting model have not been found; in this case the mathematical model is replaced by a simulator, or simulation model. Simulation modeling makes it possible to test hypotheses and to explore the influence of various factors and parameters.

    Simulation modeling is a method that allows you to build models describing processes as they would proceed in reality.

    Such a model can be “played” in time for both one test and a given set of them. In this case, the results will be determined by the random nature of the processes. Based on these data, one can obtain fairly stable statistics. Experimenting with a model is called imitation.

    Imitation - comprehension of the essence of a phenomenon without carrying out experiments on the real object.

    Imitation as a method for solving non-trivial problems received its initial development with the advent of computers in the 1950s-1960s. Types of imitation: the Monte Carlo method (the method of statistical tests) and the method of simulation modeling (statistical simulation).

    Simulation is called for when: 1) it is expensive or impossible to experiment on the real object; 2) an analytical model cannot be built because the system involves time, causal relationships, consequences, nonlinearities, and random variables; 3) it is necessary to imitate the behavior of the system in time.

    Purpose of simulation - to reproduce the behavior of the system under study based on an analysis of the most significant relationships between its elements; in other words, to develop a simulator of the subject area under study for carrying out various experiments.

    Types of simulation.

    Agent-based modeling - a relatively new (1990s-2000s) direction in simulation modeling, used to study decentralized systems whose dynamics are determined not by global rules and laws (as in other modeling paradigms), but the other way round: the global rules and laws emerge from the individual activity of the members of the group. The goal of agent-based models is to gain an idea of these global rules and of the overall behavior of the system, starting from assumptions about the individual, private behavior of its active objects and the interactions of these objects within the system. An agent is an entity that has activity and autonomous behavior, can make decisions according to some set of rules, interacts with its environment, and changes on its own.

    Discrete-event modeling - an approach to modeling that proposes to abstract away from the continuous nature of events and to consider only the main events of the modeled system, such as "waiting", "order processing", "movement with a load", "unloading", and so on. Discrete-event modeling is the most developed kind and has a huge range of applications, from logistics and queuing systems to transport and production systems. This type of modeling is best suited to modeling production processes. It was founded by Geoffrey Gordon in the 1960s.

    System dynamics - for the system under study, graphical diagrams of causal relationships and of the global influences of some parameters on others over time are built, and the model created on the basis of these diagrams is then simulated on a computer. More than the other paradigms, this kind of modeling helps to understand the essence of what is going on and to identify cause-and-effect relationships between objects and phenomena. System dynamics is used to build models of business processes, city development, production, population dynamics, ecology, and the spread of epidemics. The method was founded by Jay Forrester in the 1950s.

    Some areas of application for simulation are: business processes, warfare, population dynamics, traffic, IT infrastructure, project management, ecosystems. Popular computer simulation systems: AnyLogic, Aimsun, Arena, eM-Plant, Powersim, GPSS.

    Simulation modeling allows you to study the behavior of a system over time. An added advantage is that time in the model can be controlled: slowed down for fast processes and accelerated for modeling systems with slow variability. It is also possible to imitate the behavior of objects with which real experiments are expensive, impossible, or dangerous.

    In this article, we will talk about simulation models. This is a rather complex topic that requires separate consideration. That is why we will try to explain this issue in an accessible language.

    Simulation models

    What are we talking about? To begin with, simulation models are needed to reproduce the characteristics of a complex system in which elements interact. Such simulation has a number of features.

    Firstly, the object of modeling is most often a complex, composite system. Secondly, random factors are always present and have a certain impact on the system. Thirdly, it is necessary to describe the complex and lengthy process that is observed as a result of the modeling. Fourthly, the desired results cannot be obtained without the use of computer technology.

    Development of a simulation model

    The idea is that each object has a certain set of characteristics, all of which are stored in the computer in special tables. The interaction of these values and indicators is always described by an algorithm.

    The peculiarity and appeal of modeling is that it proceeds step by step, which makes it possible to change the characteristics and parameters gradually and obtain different results. The program implementing the simulation model reports the results obtained for each such change. A graphic or animated presentation of the results is often used, which greatly simplifies the perception and understanding of many complex processes that are hard to grasp in algorithmic form.
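
    Each row of such a table is easy to represent in a program as a record. Below is a minimal sketch of this idea, written in the same Pascal style as the code examples later in the text; the object characteristics (age, energy, an "alive" flag) and the update rule are purely illustrative assumptions, not part of any particular model.

    program ObjectTables;
    { A minimal sketch: every modeled object is one row of a "table"      }
    { (an array of records), and one step of the interaction algorithm    }
    { updates these characteristics.  All fields and numbers here are     }
    { illustrative assumptions.                                            }
    const
      MaxObj = 100;
    type
      TObjState = record                    { hypothetical characteristics of one object }
        Age: Integer;
        Energy: Real;
        Alive: Boolean;
      end;
    var
      Obj: array[1..MaxObj] of TObjState;   { the "table" kept in memory }
      i: Integer;
    begin
      for i := 1 to MaxObj do               { initial state of the table }
      begin
        Obj[i].Age := 0;
        Obj[i].Energy := 1.0;
        Obj[i].Alive := True;
      end;
      for i := 1 to MaxObj do               { one simulation step }
        if Obj[i].Alive then
        begin
          Obj[i].Age := Obj[i].Age + 1;
          Obj[i].Energy := Obj[i].Energy - 0.1;   { each step costs energy }
          if Obj[i].Energy <= 0 then
            Obj[i].Alive := False;                { the object "dies" }
        end;
      WriteLn('Step finished; age of object 1 = ', Obj[1].Age);
    end.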

    Determinism

    Simulation mathematical models copy the qualities and characteristics of real systems. Consider an example in which it is necessary to study the size and dynamics of a population of certain organisms. With the help of modeling, each organism can be considered separately in order to analyze its individual indicators. The conditions are usually set verbally: for example, after a certain period of time the organism reproduces, and after a longer period it dies. The fulfillment of all these conditions is implemented in the simulation model.

    Examples of modeling the motion of gas molecules are also given very often, because it is known that they move chaotically. One can describe the interaction of molecules with the walls of the vessel and with each other in the form of an algorithm, which makes it possible to average the characteristics of the entire system and to analyze it. Such a computer experiment, in fact, comes close to a real one, since all the characteristics are modeled very accurately. But what is the point of this process?

    The point is that the simulation model makes it possible to single out specific, pure characteristics and indicators. It sheds, as it were, random and extraneous factors that the researchers may not even suspect. Note that deterministic simulation and descriptive mathematical modeling are often very similar, unless an autonomous control strategy is to be produced as a result. The examples considered above are deterministic systems: they contain no elements of probability.

    Random processes

    The name is easy to understand if you draw a parallel with ordinary life. For example, when you are queuing at a store that closes in 5 minutes and wondering whether you will make it in time. Randomness also shows itself when you call someone, count the rings, and wonder how likely it is that you will get through. It may seem surprising, but from such simple examples, at the beginning of the last century, a new branch of mathematics was born: queuing theory. It uses statistics and probability theory to draw its conclusions. Later, researchers showed that this theory is closely related to military affairs, economics, production, ecology, biology, and so on.

    Monte Carlo method

    An important method for solving queuing problems is the method of statistical tests, or the Monte Carlo method. Note that studying random processes analytically is rather complicated, whereas the Monte Carlo method is very simple and universal, which is its main feature. We can consider the example of a store that customers enter one at a time or several at once, or patients arriving at an emergency room singly or in a whole crowd, and so on. We understand that all these are random processes, and that the time intervals between individual events are independent quantities distributed according to laws that can be deduced only from a huge number of observations. Sometimes this is not possible, so an average version is taken. But what is the purpose of simulating random processes?

    The point is that simulation allows you to get answers to many questions. The simplest is to calculate how long a person will have to stand in the queue, given all the circumstances. This seems a fairly simple example, but it is only the first level, and there can be a great many such situations, in which timing is sometimes very important.

    You can also ask how the waiting time for service is distributed. An even harder question is how the parameters must be related so that a crisis does not occur in which the turn of a newly arrived customer never comes. These questions look easy, but if you think about them and complicate them even a little, it becomes clear that they are not so easy to answer.

    Process

    How does the simulation of a random process work? Mathematical formulas are used, namely the laws of distribution of the random variables, together with numerical constants. Note that no equations of the kind used in analytical methods need to be solved. Instead, the queue we talked about above is simply imitated: first, programs are used that generate random numbers and match them to a given distribution law; then the large volume of obtained values undergoes statistical processing, which analyzes the data against the original goal of the modeling. Continuing, one can, for example, find the optimal number of people working in the store so that a queue never forms. The mathematical apparatus used here consists of the methods of mathematical statistics.
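
    As a minimal illustration of this scheme, here is a sketch (in Pascal, the language of the code examples further in the text) of the "will I be served before the store closes" situation mentioned above. The number of people ahead, the closing time, and the uniform service-time interval are all illustrative assumptions.

    program QueueChance;
    { A Monte Carlo sketch: the store closes in 5 minutes, K people stand }
    { ahead of us, and each of them is served for a random time uniformly }
    { distributed between MinT and MaxT minutes.  All numbers here are    }
    { illustrative assumptions.                                            }
    const
      Trials = 100000;     { number of statistical tests }
      K      = 3;          { customers ahead of us }
      Closes = 5.0;        { minutes left until closing }
      MinT   = 0.5;        { shortest service time, minutes }
      MaxT   = 3.0;        { longest service time, minutes }
    var
      t, j, lucky: Integer;
      total: Real;
    begin
      Randomize;
      lucky := 0;
      for t := 1 to Trials do
      begin
        total := 0.0;
        for j := 1 to K do                                  { serve the people ahead }
          total := total + MinT + (MaxT - MinT) * Random;   { one random service time }
        if total < Closes then
          lucky := lucky + 1;                               { our turn came before closing }
      end;
      WriteLn('Estimated probability of being served: ', lucky / Trials : 0 : 3);
    end.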

    Education

    Little attention is paid to the analysis of simulation models in schools, and unfortunately this can affect the future quite seriously. Children should learn some basic principles of modeling at school, since the development of the modern world is impossible without this process. In a basic computer science course, children can easily work with the "Life" simulation model.

    A more thorough study can be offered in high school or in specialized schools. First of all, it is necessary to study the simulation of random processes. In Russian schools such concepts and methods are only beginning to be introduced, so it is very important to maintain the level of training of teachers, who will certainly face a number of questions from children. At the same time, we are talking about an elementary introduction to the topic, which can be covered in detail in a couple of hours.

    After the children have mastered the theoretical basis, it is worth turning to the technical issues involved in generating sequences of random numbers on a computer. There is no need to load children with information about how the computer works or what principles the analysis is built on. As practical skills, they need to be taught to create generators of random numbers distributed uniformly on a given segment, or of random numbers following a given distribution law.

    Relevance

    Let's talk a little about why simulation models are needed in management. The fact is that in the modern world it is almost impossible to do without modeling in any area. Why is it so popular and in demand? Simulation can stand in for real experiments whose results are needed but which are too expensive to set up and analyze, and for cases in which real experiments are simply prohibited. It is also used when it is impossible to build an analytical model because of a number of random factors, consequences, and causal relationships, and when it is necessary to simulate the behavior of a system over a given period of time. For all this, simulators are created that try to reproduce the qualities of the original system as closely as possible.

    Kinds

    Simulation models used in research can be of several kinds, so let us consider the main approaches to simulation. The first is system dynamics, which is based on interconnected variables, accumulators, and feedback loops; most often, systems are considered in terms of their common characteristics and points of intersection. The next type is discrete-event modeling, which concerns cases where there are certain processes and resources as well as a sequence of actions; most often, the possibility of an event is studied through the prism of a number of possible or random factors. The third type of modeling is agent-based: it studies the individual properties of the elements within their system and requires direct or indirect interaction between the observed object and the others.

    Discrete-event modeling proposes to abstract away from the continuity of events and to consider only the main points; random and irrelevant factors are thereby excluded. This method is highly developed and is used in many areas, from logistics to production systems, and it is the one best suited to modeling production processes. It was created in the 1960s by Geoffrey Gordon. System dynamics is a modeling paradigm in which research relies on a graphical representation of the connections and of the mutual influences of some parameters on others, taking the time factor into account; a global model is then created on the computer on the basis of these data. It is this view that allows one to understand the essence of the phenomenon under study very deeply and to identify causes and connections. Thanks to this kind of modeling, business strategies, production models, models of disease development, city planning, and so on are built. The method was invented in the 1950s by Jay Forrester.

    Agent-based modeling appeared in the 1990s and is relatively new. This direction is used to analyze decentralized systems whose dynamics are determined not by generally accepted laws and rules but by the individual activity of particular elements. The essence of this modeling is to gain an idea of these emergent rules, to characterize the system as a whole, and to find the connections between the individual components. The element studied is active and autonomous: it can make decisions independently, interact with its environment, and change on its own, which is very important.

    Stages

    Now let us consider the main stages of developing a simulation model. They include formulating the problem at the very beginning of the process, building a conceptual model, choosing a modeling method, choosing the modeling tools, planning, and performing the task; at the last stage, all the obtained data are analyzed and processed. Building a simulation model is a complex and lengthy process that requires a lot of attention and understanding of the subject matter. Note that these preparatory steps take most of the time, while the simulation run itself on a computer takes no more than a few minutes. It is very important to use a correct simulation model, for without it you will not be able to achieve the desired results: some data will be obtained, but they will be neither realistic nor productive.

    Summing up, I would like to say that this is a very important and modern field. We have looked at examples of simulation models in order to understand the importance of all these points. In the modern world, modeling plays a huge role, since the economy, urban planning, production, and so on are developed on its basis. It is important to understand that simulation models of systems are in great demand because they are extremely useful and convenient. Even when real conditions can be created, it is not always possible to obtain reliable results, since many stochastic factors, which simply cannot be taken into account, always exert an influence.

    21. Simulation modeling. Principles of Building Simulation Models

    Simulation mathematical models are used when a technical system is especially complex or when a high level of detail is required to represent the processes taking place in it. Such systems include economic and industrial facilities, seaports, airports, oil and gas pumping complexes, irrigation systems, software for complex control systems, computer networks, and many others. For such technical systems, in order to obtain an analytical mathematical model, the researcher is forced to impose strict restrictions on the model and resort to simplifications; some features of the technical system have to be neglected, and as a result the mathematical model ceases to be a means of studying the complex system. In simulation models, the simulated behavior algorithm of the technical system approximately reproduces the original process itself in the sense of its functioning in time. The elementary phenomena that make up the process are imitated while maintaining their logical structure and order of occurrence in time. Thus, a special algorithm is implemented on a computer that reproduces the formalized process of behavior of the technical system; based on the initial data, this algorithm makes it possible to obtain information about the change of the states and responses of the model over time t. The algorithm can be divided into three functional parts: modeling of the elementary subprocesses; taking into account their interaction and combining them into a single process; and ensuring the coordinated operation of the individual subprocesses when the mathematical model is implemented on a computer. The influence of random factors on the course of the process is simulated using random number generators with given probabilistic characteristics. During the simulation, statistics on system states and changes in responses are continuously recorded; these statistics are either processed during the simulation itself or accumulated and processed by statistical methods at the end of the specified simulation interval T_M. As can be seen, the idea of imitation is attractive in its simplicity, but it is expensive to implement, so simulation models are used only in cases where other modeling methods are ineffective.

    Model - a representation of an object, system, or concept (idea) in some form different from the form of its real existence.

    Simulation model - a logical and mathematical description of an object that can be used for experimenting on a computer in order to design, analyze, and evaluate the functioning of the object.

    Simulation modeling - a method that allows you to build models describing processes as they would proceed in reality.

    Such a model can be "run" in time, both for a single test and for a given set of tests. The results will then be determined by the random nature of the processes, and from these data one can obtain fairly stable statistics.

    The goal of simulation is to reproduce the behavior of the system under study based on an analysis of the most significant relationships between its elements; in other words, to develop a simulator of the studied subject area on which various experiments can be carried out.

    Stages:

      problem formulation;

      building a mathematical model of the system's functioning;

      compilation and debugging of a computer program, including the development of procedures for modeling various random factors;

      planning simulation experiments;

      conducting experiments and processing research results.

    Principles of building a simulation model (IM):

    The Δt principle.

    The principle is that the simulation algorithm simulates movement, that is, a change in the state of the system, at fixed times: t, t + Δt, t + 2Δt, t + 3Δt, ...

    For this, a time counter (the model clock) is started, which on each cycle increases its value t by the time step Δt, starting from zero (the start of the simulation). Thus, changes in the system are tracked tick by tick at the given times: t, t + Δt, t + 2Δt, t + 3Δt, ...
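
    A minimal Pascal sketch of this principle is given below; the state variable and its update rule (a simple decay of the quantity x) are purely illustrative assumptions.

    program DeltaTPrinciple;
    { A sketch of the "delta t" principle: the model clock is advanced    }
    { by a fixed step dt, and the system state is recomputed at every     }
    { tick.  The state update used here (simple decay) is illustrative.   }
    const
      dt   = 0.1;          { time step }
      TEnd = 5.0;          { length of the simulated interval }
    var
      t, x: Real;
    begin
      t := 0.0;            { start of the simulation }
      x := 100.0;          { illustrative initial state of the system }
      while t < TEnd do
      begin
        x := x - 0.2 * x * dt;   { recompute the state on this tick }
        t := t + dt;             { advance the model clock by dt }
        WriteLn('t = ', t:0:1, '   x = ', x:0:2);
      end;
    end.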

    The principle of special states.

    The state the system is in most of the time is called an ordinary state. Such states are not of interest, even though they occupy most of the time.

    Special states are states at isolated moments of time in which the characteristics of the system change abruptly. For the state of the system to change, a definite cause is needed, for example the arrival of the next input signal. Clearly, from the point of view of modeling, it is precisely the changes in the characteristics of the system that are of interest, so the principle requires us to track the moments when the system passes from one special state to another.
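
    Below is a Pascal sketch of this principle: the model clock jumps directly from one special state to the next instead of ticking with a fixed step. For simplicity the sketch assumes a loss system (a customer who finds the server busy simply leaves); all distributions and numbers are illustrative assumptions.

    program SpecialStates;
    { The clock jumps straight to the nearest special state: either the   }
    { next customer arrival or the end of the current service.  This is   }
    { a loss system: a customer who finds the server busy leaves.         }
    const
      TEnd = 60.0;                        { length of the simulated interval }
    var
      t, nextArrival, nextDeparture: Real;
      serverBusy: Boolean;
    begin
      Randomize;
      t := 0.0;
      nextArrival := 1.0 + 4.0 * Random;  { moment of the first arrival }
      nextDeparture := TEnd;              { no service is in progress yet }
      serverBusy := False;
      while t < TEnd do
        if (not serverBusy) or (nextArrival < nextDeparture) then
        begin
          t := nextArrival;                               { jump to the arrival }
          if not serverBusy then
          begin
            serverBusy := True;
            nextDeparture := t + 1.0 + 2.0 * Random;      { end of this service }
          end
          else
            WriteLn('t = ', t:0:2, ': customer leaves, the server is busy');
          nextArrival := t + 1.0 + 4.0 * Random;          { schedule the next arrival }
        end
        else
        begin
          t := nextDeparture;                             { jump to the end of service }
          serverBusy := False;
          WriteLn('t = ', t:0:2, ': service finished');
        end;
    end.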

    Computer modeling rests on two things:

  • building mathematical models to describe the processes under study;
  • using modern high-speed computers (millions of operations per second) capable of dialogue with the user.

    The essence of computer simulation is as follows: on the basis of a mathematical model, a series of computational experiments is carried out on a computer, i.e. the properties of objects or processes are investigated, their optimal parameters and operating modes are found, and the model is refined. For example, having an equation that describes the course of a particular process, one can change its coefficients and its initial and boundary conditions and investigate how the object will behave. Simulation modeling means carrying out, on a computer, computational experiments with mathematical models that imitate the behavior of real objects, processes, or systems.

    Real processes and systems can be investigated using two types of mathematical models: analytical and simulation.

    In analytical models, the behavior of real processes and systems (RPS) is given in the form of explicit functional dependences (linear or nonlinear, differential or integral equations, or systems of such equations). However, such dependences can be obtained only for relatively simple RPS. When the phenomena are complex and diverse, the researcher has to accept simplified representations of complex RPS, and as a result the analytical model becomes too rough an approximation to reality. If analytical models can nevertheless be obtained for complex RPS, solving them often turns into a difficult problem. Therefore the researcher is often forced to use simulation modeling.

    Simulation modeling is a numerical method for carrying out computational experiments on a computer with mathematical models that imitate the behavior of real objects, processes, and systems in time over a given period. The functioning of the RPS is divided into elementary phenomena, subsystems, and modules, whose operation is described by a set of algorithms that imitate the elementary phenomena while preserving their logical structure and sequence in time.

    Simulation modeling is a set of methods for the algorithmic description of the functioning of the objects under study, the software implementation of those algorithmic descriptions, and the organization, planning, and execution on a computer of computational experiments with mathematical models that imitate the functioning of the RPS over a given period.

    The algorithmization of RPS functioning is understood as a step-by-step description of the operation of all its functional subsystems and individual modules, with a level of detail corresponding to the requirements imposed on the model.

    "Simulation modeling" (IM) is a double term. "Simulation" and "simulation" are synonymous. Virtually all areas of science and technology are models of real processes. To distinguish mathematical models from each other, researchers began to give them additional names. Term "simulation modeling" means that we are dealing with such mathematical models, with the help of which it is impossible to calculate or predict the behavior of the system in advance, and to predict the behavior of the system it is necessary computational experiment (imitation) on a mathematical model with given initial data.

    The main advantages of IM:

    1. the ability to describe the behavior of the components (elements) of processes or systems at a high level of detail;
    2. the absence of restrictions on the relationship between the IM parameters and the state of the external environment of the RPS;
    3. the ability to study the dynamics of the interaction of the components in time and in the space of the system parameters.

    These advantages have made the imitation method widespread. Simulation modeling is typically used in the following cases:

    1. When there is no complete formulation of the research problem and the object of modeling is still being studied; the simulation model then serves as a means of learning about the phenomenon.
    2. When analytical methods are available but the mathematical procedures are so complex and time-consuming that simulation modeling gives a simpler way to solve the problem.
    3. When, in addition to assessing the influence of the parameters (variables) of a process or system, it is desirable to observe the behavior of its components (elements) over a certain period.
    4. When simulation modeling turns out to be the only way to study a complex system because of the impossibility of observing the phenomena in real conditions (thermonuclear fusion reactions, space exploration).
    5. When it is necessary to control the course of processes or the behavior of systems by slowing down or speeding up the phenomena during the simulation.
    6. When specialists are being trained on new equipment and simulation models provide an opportunity to acquire skills in operating it.
    7. When new situations in the RPS are being studied; imitation then serves to test new strategies and decision rules before full-scale experiments are carried out.
    8. When the sequence of events in the designed RPS is of particular importance and the model is used to predict the bottlenecks in its operation.

    However, along with its advantages, IM also has disadvantages:

    1. Developing a good IM is often more expensive and more time-consuming than creating an analytical model.
    2. It may turn out that the IM is inaccurate (which happens often), and we are unable to measure the degree of this inaccuracy.
    3. Researchers often turn to IM without appreciating the difficulties they will encounter and thereby commit a number of methodological errors.

    Nevertheless, IM is one of the most widely used methods for solving problems of synthesis and analysis of complex processes and systems.

    One type of simulation is statistical simulation modeling, which makes it possible to reproduce the functioning of complex random processes on a computer.

    In the study of complex systems subject to random disturbances, probabilistic analytical models and probabilistic simulation models are used.

    In probabilistic analytical models, the influence of random factors is taken into account by specifying the probabilistic characteristics of random processes (probability distribution laws, spectral densities, or correlation functions). Constructing probabilistic analytical models is a complex computational task, so probabilistic analytical modeling is used only to study relatively simple systems.

    It is noted that the introduction of random perturbations into simulation models does not introduce fundamental complications, therefore, the study of complex random processes is currently being carried out, as a rule, on simulation models.

    Probabilistic simulation modeling operates not with the characteristics of random processes but with specific random numerical values of the parameters of the process or system. The results obtained by running the simulation model of the process under consideration are therefore random realizations. To find objective and stable characteristics of the process, it has to be reproduced many times, with subsequent statistical processing of the data obtained. That is why the study of complex processes and systems subject to random disturbances by means of simulation is usually called statistical modeling.

    Simulation models

    A simulation model reproduces the behavior of a complex system of interacting elements. Simulation modeling is characterized by the presence of the following circumstances (all or some of them simultaneously):

    • the object of modeling is a complex heterogeneous system;
    • the simulated system contains factors of random behavior;
    • it is required to obtain a description of the process developing in time;
    • it is fundamentally impossible to obtain simulation results without using a computer.

    The state of each element of the simulated system is described by a set of parameters that are stored in the computer memory in the form of tables. The interactions of the system elements are described algorithmically. Simulation is carried out step by step. At each step of the simulation, the values of the system parameters change. The program that implements the simulation model reflects the change in the state of the system, giving the values of its desired parameters in the form of tables by time steps or by the sequence of events occurring in the system. To visualize the simulation results, a graphical representation, including animation, is often used.

    Deterministic modeling

    The simulation model is based on imitating a real process. For example, when modeling the change (dynamics) in the number of microorganisms in a colony, one can consider many separate objects and follow the fate of each of them, setting certain conditions for its survival, reproduction, and so on. These conditions are usually stated verbally; for example: after a certain period of time the microorganism divides in two, and after another (longer) interval it dies. The fulfillment of the stated conditions is implemented algorithmically in the model.

    Another example: modeling the movement of molecules in a gas, where each molecule is represented as a ball with a certain direction and speed of movement. The interaction of two molecules, or of a molecule with the vessel wall, follows the laws of perfectly elastic collision and is easily described algorithmically. The integral (overall, averaged) characteristics of the system are obtained at the level of statistical processing of the simulation results.

    Such a computer experiment in effect stands in for a natural experiment. To the question "Why do this?" the following answer can be given: simulation modeling makes it possible to isolate "in pure form" the consequences of the hypotheses embedded in the notion of micro-events (i.e. at the level of the system's elements), freeing them from the influence of other factors that are inevitable in a natural experiment and that we may not even suspect. If such modeling also includes elements of a mathematical description of the processes at the micro level, and if the researcher does not set the task of finding a strategy for controlling the results (for example, managing the size of a colony of microorganisms), then the difference between a simulation model and a mathematical (descriptive) model turns out to be rather arbitrary.

    The above examples of simulation models (the evolution of a colony of microorganisms, the movement of molecules in a gas) lead to a deterministic description of the systems: they contain no elements of probability, no randomness of events. Let us now consider an example of modeling a system that does have these qualities.

    Random Process Models

    Who has not stood in a queue and wondered whether they will have time to make a purchase (or pay the rent, ride the carousel, etc.) in the time at their disposal? Or tried to call an information desk, run into busy signals several times, and started estimating whether they would get through or not? From such "simple" problems, at the beginning of the 20th century, a new branch of mathematics was born: queuing theory, which uses the apparatus of probability theory, mathematical statistics, differential equations, and numerical methods. It later turned out that this theory has numerous applications in economics, military affairs, the organization of production, biology and ecology, and so on.

    Computer simulation, implemented in the form of the statistical test method (the Monte Carlo method), plays an important role in solving queuing problems. The capabilities of analytical methods for solving real-life queuing problems are very limited, while the statistical test method is universal and relatively simple.

    Let us consider the simplest problem of this class. There is a store with one seller, which customers enter at random moments. If the seller is free, he begins to serve the customer immediately; if several customers arrive at once, a queue forms. There are many other similar situations:

    • a repair area of a bus depot or car service and the buses that have left the route because of a breakdown;
    • a trauma center and the patients who have come in because of an injury (i.e. without an appointment system);
    • a telephone exchange with one line (or one telephone operator) and the subscribers who are queued when the line is busy (such a system is sometimes practiced);
    • a local network server and the personal computers at workplaces that send messages to a server able to receive and process no more than one message at a time.

    The process of customers coming to a store is a random process. The time intervals between the arrivals of any consecutive pair of buyers are independent random events distributed according to a certain law that can only be established by numerous observations (or some plausible version of it is taken for modeling). The second random process in this problem, which has nothing to do with the first, is the duration of service for each customer.

    The purpose of modeling systems of this type is to answer a number of questions. A relatively simple one: what is the average time one will have to stand in the queue, for given distribution laws of the random variables listed above? A more difficult question: what is the distribution of the waiting times for service in the queue? An equally difficult one: at what ratios of the parameters of the input distributions will a crisis occur in which the turn of a newly arrived customer never comes? If you think about this relatively simple problem, the possible questions multiply.

    In general terms, the modeling method looks like this. The mathematical formulas used are the distribution laws of the initial random variables; the numerical constants used are the empirical parameters that enter these formulas. No equations of the kind that would appear in an analytical study of this problem are solved. Instead, the queue is imitated, "played out" with the help of computer programs that generate random numbers with the given distribution laws. Then statistical processing is carried out on the set of obtained values of the quantities determined by the stated goals of modeling. For example, one can find the optimal number of sellers for different periods of the store's working hours that will ensure no queues form. The mathematical apparatus used here is that of mathematical statistics.
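
    Here is a minimal Pascal sketch of this scheme for the store with one seller. The inter-arrival and service times are assumed uniform on illustrative intervals (a simplification, as discussed further in the text); the program estimates the average waiting time in the queue.

    program StoreQueue;
    { The statistical test method for the store with one seller: the      }
    { inter-arrival and service times are uniform on illustrative         }
    { intervals, and the average waiting time in the queue is estimated   }
    { over many simulated customers.                                      }
    const
      NCustomers = 100000;          { length of the statistical test }
    var
      i: Integer;
      arrival, serviceStart, serverFree, sumWait: Real;
    begin
      Randomize;
      arrival := 0.0;
      serverFree := 0.0;            { moment when the seller becomes free }
      sumWait := 0.0;
      for i := 1 to NCustomers do
      begin
        arrival := arrival + 1.0 + 4.0 * Random;    { next inter-arrival time, min }
        if arrival > serverFree then
          serviceStart := arrival                   { the seller is idle: no waiting }
        else
          serviceStart := serverFree;               { the customer waits in the queue }
        sumWait := sumWait + (serviceStart - arrival);
        serverFree := serviceStart + 0.5 + 3.0 * Random;   { service time, min }
      end;
      WriteLn('Average waiting time, min: ', sumWait / NCustomers : 0 : 3);
    end.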

    Another example of simulation modeling is described in the article "Modeling ecological systems and processes": one of many models of the predator-prey system. Individuals of the species that stand in this relationship move according to certain rules that contain elements of randomness; predators eat prey, both reproduce, and so on. Such a model contains no mathematical formulas, but it does require statistical processing of the results.

    An example of a simulation model with a deterministic algorithm

    Let's consider a simulation model of the evolution of a population of living organisms, known as "Life", which is easy to implement in any programming language.

    To construct the game algorithm, consider a square field of n + 1 columns and rows, with the usual numbering from 0 to n. For convenience, we define the outermost boundary columns and rows as a "dead zone"; they play only an auxiliary role.

    For any inner cell of the field with coordinates (i, j), 8 neighbors can be identified. If the cell is "alive", we paint it over; if the cell is "dead", we leave it empty.

    Let's set the rules of the game. If the cell (i, j) is "alive" and is surrounded by more than three "live" cells, it dies (from overpopulation). A "living" cell also dies if there are less than two "living" cells in its environment (from loneliness). A "dead" cell comes to life if three "living" cells appear around it.

    For convenience, we introduce a two-dimensional array A whose elements take the value 0 if the corresponding cell is empty and 1 if the cell is "alive". Then the algorithm for determining the state of the cell with coordinates (i, j) can be written as follows:

    S := A[i-1, j-1] + A[i-1, j] + A[i-1, j+1] + A[i, j-1] + A[i, j+1] + A[i+1, j-1] + A[i+1, j] + A[i+1, j+1];
    B[i, j] := A[i, j];                                        { by default the cell keeps its state }
    If (A[i, j] = 1) And ((S > 3) Or (S < 2)) Then B[i, j] := 0;
    If (A[i, j] = 0) And (S = 3) Then B[i, j] := 1;

    Here the array B defines the state of the field at the next step. The above holds for all inner cells, from i = 1 to n - 1 and j = 1 to n - 1. Subsequent generations are determined in the same way; it is only necessary to carry out the reassignment procedure:

    For I := 1 To N - 1 Do
      For J := 1 To N - 1 Do
        A[I, J] := B[I, J];

    It is more convenient to display the field state on the display screen not in a matrix, but in a graphical form.
    It remains only to determine the procedure for setting the initial configuration of the playing field. For a random determination of the initial state of the cells, the following algorithm is suitable:

    For I := 1 To K Do
    Begin
      K1 := Random(N - 1) + 1;   { random row number from 1 to N - 1 }
      K2 := Random(N - 1) + 1;   { random column number from 1 to N - 1 }
      A[K1, K2] := 1;            { the chosen cell becomes "alive" }
    End;

    It is more interesting for the user to set the initial configuration himself, which is easy to implement. As a result of experiments with this model, one can find, for example, stable populations of living organisms that never die out, remaining unchanged or changing their configuration with a certain period. A population in the form of a "cross" is absolutely unstable (it dies out in the second generation).

    In a basic computer science course, students can implement the Life simulation model within the Introduction to Programming section. A more thorough mastering of simulation modeling can occur in high school in a specialized or elective computer science course. Further, this option will be discussed.

    The study begins with a lecture on the simulation of random processes. In the Russian school, the concepts of probability theory and mathematical statistics are only beginning to be introduced into the mathematics course, and the teacher should be prepared to give an introduction to this material himself; it is essential for forming a worldview and a mathematical culture. We emphasize that we are talking about an elementary introduction to the concepts under discussion; this can be done in 1-2 hours.

    Then we discuss the technical issues involved in generating sequences of random numbers with a given distribution law on a computer. Here one can rely on the fact that every universal programming language has a generator of random numbers uniformly distributed over the interval from 0 to 1; at this stage it is inappropriate to go into the complex question of how it is implemented. Based on the available random number generator, we show how to construct (a sketch of both generators follows after the list):

    a) generator of uniformly distributed random numbers on any interval [a, b];

    b) a random number generator for almost any distribution law (for example, using an intuitively clear "selection-rejection" method).
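
    A possible Pascal sketch of both generators is given below; the density used to illustrate the "selection-rejection" method (f(x) = 2x on [0, 1]) is an arbitrary illustrative choice.

    program RandomGenerators;
    { Sketches of the two generators mentioned above, built on top of     }
    { the standard Random function (uniform on [0, 1)).                   }

    { a) a random number uniformly distributed on an arbitrary segment [a, b] }
    function UniformAB(a, b: Real): Real;
    begin
      UniformAB := a + (b - a) * Random;
    end;

    { an illustrative probability density on [0, 1], bounded above by 2 }
    function Density(x: Real): Real;
    begin
      Density := 2.0 * x;
    end;

    { b) the "selection-rejection" generator: draw a point (x, y) uniformly }
    { in the rectangle [0, 1] x [0, fMax] and accept x only when the point  }
    { lies under the density curve                                          }
    function RejectionSample: Real;
    const
      fMax = 2.0;                  { upper bound of the density }
    var
      x, y: Real;
    begin
      repeat
        x := Random;               { candidate value }
        y := fMax * Random;        { random height }
      until y <= Density(x);
      RejectionSample := x;
    end;

    begin
      Randomize;
      WriteLn('Uniform on [2, 5]:   ', UniformAB(2.0, 5.0) : 0 : 3);
      WriteLn('From the density 2x: ', RejectionSample : 0 : 3);
    end.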

    It is advisable to begin the consideration of the queuing problem described above with a discussion of the history of solving queuing problems (Erlang's problem of servicing calls at a telephone exchange). This is followed by the simplest problem, which can be formulated using the example of the formation and examination of a queue in a store with one seller. Note that at the first stage of modeling, the distributions of the random variables at the input can be assumed uniform, which, although unrealistic, removes a number of difficulties (to generate the random numbers, one can simply use the generator built into the programming language).

    We draw the students' attention to the questions that are raised first when modeling systems of this type. First of all, this is the calculation of the mean values (mathematical expectations) of certain random variables. For example, what is the average time spent standing in the queue at the counter? Or: find the average time the seller spends waiting for a customer.

    The teacher's task, in particular, is to make clear that sample means are themselves random variables: in another sample of the same size they will have different values (for large sample sizes, not too different from one another). Further options are possible. In a more prepared audience, one can show a method for estimating the confidence intervals that contain the mathematical expectations of the corresponding random variables at given confidence probabilities (using methods known from mathematical statistics, without attempting to justify them). In a less prepared audience, one can limit oneself to a purely empirical statement: if in several samples of equal size the mean values coincide to a certain decimal place, then that digit is most likely correct. If the simulation fails to achieve the desired accuracy, the sample size should be increased.
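
    This point is easy to demonstrate with a small Pascal sketch: several independent samples of the same random quantity are drawn and their means are printed, so pupils can see to which decimal place the means agree and how this changes when the sample size N is increased. The random quantity used (uniform on an arbitrary segment) is illustrative.

    program SampleMeans;
    { Several independent samples of the same illustrative random         }
    { quantity; the sample means themselves turn out to be random and     }
    { agree only to a certain decimal place.                              }
    const
      N = 10000;          { size of one sample }
      M = 5;              { number of independent samples }
    var
      i, j: Integer;
      sum: Real;
    begin
      Randomize;
      for j := 1 to M do
      begin
        sum := 0.0;
        for i := 1 to N do
          sum := sum + 0.5 + 3.0 * Random;    { one observation, uniform on [0.5, 3.5] }
        WriteLn('Sample ', j, ': mean = ', sum / N : 0 : 4);
      end;
    end.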

    In an even more mathematically prepared audience, one can pose the question: what is the distribution of random variables that are the results of statistical modeling, for given distributions of random variables that are its input parameters? Since the presentation of the corresponding mathematical theory in this case is impossible, one should restrict oneself to empirical methods: constructing histograms of the final distributions and comparing them with several typical distribution functions.

    After mastering the primary skills of this modeling, we turn to a more realistic model, in which the input streams of random events are distributed, for example, according to Poisson. This will require students to additionally master the method of generating sequences of random numbers with the specified distribution law.
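
    For a Poisson input stream, the intervals between successive arrivals are exponentially distributed, and such an interval can be obtained from the uniform Random value r by the inverse transform tau = -ln(1 - r) / lambda. A small Pascal sketch follows; the intensity Lambda is an illustrative value.

    program PoissonStream;
    { Generating a Poisson input stream: the inter-arrival intervals are  }
    { exponential, obtained from the uniform Random by the inverse        }
    { transform.  The intensity Lambda is an illustrative value.          }
    const
      Lambda = 0.5;                { average number of arrivals per minute }
      NArrivals = 10;
    var
      i: Integer;
      t, tau: Real;
    begin
      Randomize;
      t := 0.0;
      for i := 1 to NArrivals do
      begin
        tau := -Ln(1.0 - Random) / Lambda;   { exponential inter-arrival interval }
        t := t + tau;                        { moment of the i-th arrival }
        WriteLn('Arrival ', i, ' at t = ', t:0:2, ' min');
      end;
    end.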

    In the problem considered, as in any more complex problem of queues, a critical situation may arise when the queue grows indefinitely with time. Modeling the approach to a critical situation as one of the parameters increases is an interesting research problem for the most prepared students.

    Using the queue problem as an example, several new concepts and skills are worked out at once:

    • concepts of random processes;
    • concepts and basic skills of simulation;
    • construction of optimization simulation models;
    • building multi-criteria models (by solving problems of the most rational customer service combined with the interests of the store owner).

    The task:

      1. Make a diagram of the key concepts;
      2. Select practical tasks with solutions for the basic and specialized computer science courses.