Computer simulations are used when a real-world process, system, or situation is unavailable or impractical to work with directly.
A computer simulation requires modelling the environment or the hardware. Hardware that is unavailable, that could break during testing, or that is too expensive can be simulated so that other components in the system cannot tell that a stub is being used in place of the real hardware. A simulation also allows you to run an experiment many times with different parameters. Building a simulation involves:
- A design of the model.
- The ability to develop a model of the real hardware or situation.
- A simulation of the runtime environment.
- An evaluation of the simulation (i.e. does it work according to the intended model?).
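The hardware stub mentioned above can be sketched as a drop-in replacement that exposes the same interface as the real driver, so calling code cannot tell which one it is using. The class and method names here (`ThermometerStub`, `read_celsius`) are hypothetical, chosen only for illustration:

```python
class ThermometerStub:
    """Stand-in for a real thermometer driver.

    It implements the same interface the real driver would
    (a read_celsius() method), but returns scripted readings
    instead of talking to hardware.
    """

    def __init__(self, readings):
        self._readings = iter(readings)

    def read_celsius(self):
        # Return the next pre-programmed reading.
        return next(self._readings)


def log_temperature(sensor):
    # Application code depends only on the interface,
    # so it works unchanged with the real driver or the stub.
    return sensor.read_celsius()


stub = ThermometerStub([20.0, 20.5, 21.0])
first = log_temperature(stub)
second = log_temperature(stub)
```

Because the stub honours the same interface, the rest of the system needs no changes when the real hardware is swapped in later.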
For example, when developing signal-processing algorithms, raw sensor data can be simulated by feeding the algorithm a list of generated readings instead of input from a real sensor.
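As a minimal sketch of this idea, the generator below produces simulated sensor readings (a noisy sine wave, an assumption made purely for illustration), which are then fed into a simple moving-average filter of the kind one might be developing:

```python
import math
import random

def simulated_readings(n, noise=0.1):
    """Generate n fake sensor samples: a sine wave plus uniform noise."""
    for i in range(n):
        yield math.sin(i / 10) + random.uniform(-noise, noise)

def moving_average(samples, window=5):
    """Smooth a stream of samples with a sliding-window average."""
    buf = []
    out = []
    for s in samples:
        buf.append(s)
        if len(buf) > window:
            buf.pop(0)           # keep at most `window` recent samples
        out.append(sum(buf) / len(buf))
    return out

smoothed = moving_average(simulated_readings(100))
```

Once the filter behaves correctly on simulated data, the generator can be replaced by reads from the real sensor without changing the algorithm itself.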