System

Getting started

Import the following classes:

```python
from system import Config, Environment, System
```

Load the config:

```python
config = Config('config.yaml')
```

The config is further explained in section Config. Note that the job names specified in the config need to be identical to the data elements in the notation.

Create an instance of the simulation environment:

```python
env = Environment()
```

The Environment class is just a thin wrapper around simpy.Environment.

Create a system instance:

```python
system = System(config, notation, env=env)
```

The System needs the aforementioned config and environment; it also relies on a notation instance. For information on how the notation is created, please read the notation/README.md of the notation package. Note that the arrival process, i.e. the process that creates the jobs, needs to be explicitly specified: it is the first node of any given notation.

Then build and run the simulation with:

```python
system.build()
system.run()
```
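
Putting the steps together, a minimal end-to-end script might look like the sketch below. How the notation instance is obtained is package-specific; the `Notation` import and constructor shown here are placeholders, not the actual API — see notation/README.md for the real way to create a notation.

```python
from system import Config, Environment, System
# Hypothetical import: the notation package defines how a notation is created,
# see notation/README.md for the actual API.
from notation import Notation

config = Config('config.yaml')            # jobs, arrival process and process parameters
notation = Notation('notation.yaml')      # placeholder; first node must be the arrival process
env = Environment()                       # thin wrapper around simpy.Environment

system = System(config, notation, env=env)
system.build()
system.run()                              # runs for the configured number of steps ('until')

system.logger.plot()                      # jobs of each process over time
system.logger.save('./graphs/-/da.pkl')   # pickled xarray DataArray (time, process, job)
```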

The jobs of each process over time can be plotted with:

```python
system.logger.plot()
```

The job distribution of each process over time can be saved as a pickled xarray DataArray with:

```python
system.logger.save('./graphs/-/da.pkl')
```

This creates a DataArray with the dimensions time (step), process, and job.
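
Since the file is a pickled xarray.DataArray, it can be inspected with the standard pickle module (xarray must be installed so the DataArray class can be resolved). A minimal sketch, assuming the file was written as shown above:

```python
import pickle

# Load the DataArray written by system.logger.save(...)
with open('./graphs/-/da.pkl', 'rb') as f:
    da = pickle.load(f)

print(da.dims)   # expected dimensions: time (step), process, job
print(da.shape)
```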

Config

The following is an example config.
Note that processes are specified according to the node order generated by the notation parser: the first entry specifies process 0, which is the node with label 0 in the notation; the second entry specifies process 1, which is the node with label 1 in the notation; and so on. If the process order is unclear, let the notation draw the graph and look at the node labels. In addition to the node order, processes can also be given names; a name is optional but must match the node name specified in the notation.

```yaml
until: 10000                      # number of steps the simulation will run
loggingRate: 0.1                  # rate at which the logger will log. 0.1 means it logs 10 times per step
randomSeed: 42                    # seed for initializing the numpy.SeedSequence from which the random number generators are created
jobs:                             # list describing all jobs
  - name: 'A'
    arrivalProbability: 0.1
    failureRate: 0
  - name: 'B'                     # name of the job
    arrivalProbability: 0.1       # probability with which a created job is of this job type. Needs to sum to 1.0 over all jobs
    failureRate: 0                # probability with which the job might fail quality control
  - name: 'C'
    arrivalProbability: 0.8
    failureRate: 0
arrivalProcess:
  beta: 1.0                       # beta value of the exponential function used for simulating the job arrival time
processes:                        # each entry describes a process; its per-job processing time is drawn from a normal distribution
  - mean: 1.0
    std: 0.2
  - mean: 1.0                     # mean value for the processing time of this specific process
    std: 0.2                      # standard deviation of the processing time of this specific process
  - mean: 1.0
    std: 0.2
  - mean: 1.0
    std: 0.2
    name: 'abc'                   # optional name
```
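
To make the distribution parameters concrete, the following sketch shows how the configured values could be sampled with numpy: exponential arrival times with the given beta, normally distributed processing times per process, and job types chosen by arrivalProbability (which must sum to 1.0). This illustrates the configured distributions only; it is not the package's internal implementation.

```python
import numpy as np

rng = np.random.default_rng(np.random.SeedSequence(42))  # randomSeed from the config

# arrivalProcess.beta: scale of the exponential function for job arrival times
arrival_time = rng.exponential(scale=1.0)

# processes[i]: processing time of a job drawn from a normal distribution
processing_time = rng.normal(loc=1.0, scale=0.2)

# jobs: arrival probabilities must sum to 1.0 over all jobs
arrival_probabilities = [0.1, 0.1, 0.8]
assert abs(sum(arrival_probabilities) - 1.0) < 1e-9
job = rng.choice(['A', 'B', 'C'], p=arrival_probabilities)
```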

Optional entries are:

```yaml
jobArrivalPath: './graphs/-/job_arrivals.yaml'  # specifies a yaml file with a list of time and job entries,
                                                # specifying the job arrivals
continueWithRndJobs: False                      # if set to False and jobArrivalPath is given,
                                                # no other jobs are created after the given jobs are processed
```

An example job arrival file can be found here.
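
For orientation, a minimal sketch of what such a job arrival file could look like, assuming each entry simply pairs a time with a job name; the exact keys are an assumption and may differ from the linked example.

```yaml
# Hypothetical job arrival list; actual key names may differ.
- time: 0.5
  job: 'A'
- time: 1.2
  job: 'C'
- time: 3.0
  job: 'B'
```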