Gustavo Diaz is an Associate Professor in the College of Information Systems and Technology at the University of Phoenix and an ASQ-certified Six Sigma Black Belt. He is also an independent quality and process optimization consultant, a senior member of the American Society for Quality, and a member of The International Society for Process Improvements (ISPI).
Although simulation is one of the most widely used approaches for solving problems in Business Process Management and other fields dealing with operations management, little is known about the systematic management of simulation projects and their life cycles (Shtub, 2010). This write-up discusses various issues regarding the management of a simulation project and simulation life-cycle models. A life cycle based on an adaptation of Six Sigma DMAIC is discussed in detail. Furthermore, the costs associated with simulation projects are defined and analyzed. Typically, there is no clear accounting for simulation projects, as most of these efforts do not follow a strict project management methodology.
A current trend in managing simulation projects is the allocation of a “simulation expert” along with a process subject matter expert (SME), a person with functional expertise in the area being modeled. This “team”, in a semi-formal manner, defines the business process requirements, some of the metrics to capture via the simulation effort, and some loosely defined measures of success for the project and the simulation model. The expected outcome of this approach is a deliverable that has had very little scrutiny or input from the user community and/or the process owner. Furthermore, the simulation model itself is a black box known only to those two individuals. Future model updates, maintenance, and process changes almost invariably call for a makeover of the model (i.e., starting from scratch). In some cases, even a change of simulation platform is required, as the expertise and knowledge of the simulation model were kept to those few (Shtub, 2010).
Another trend that has developed in the past several years is toward a simulation user who has functional expertise in the process being modeled but no simulation or programming background. This places requirements on the simulation system and the simulation vendors to provide an extremely user-friendly interface as well as automation of the statistical analysis and of the simulation model's data repository management (Sturrock, 2009). For the most part, simulation vendors have been able to respond to this trend with varying degrees of effectiveness (for instance, simulation software packages such as ProModel, Arena, and SIMIO have very different levels of user-interface friendliness).
One other point of interest in the simulation market space is the recognition that simulation model development aligns very closely with application software development, more closely than many of today’s practitioners want to admit or talk about. As a matter of fact, large simulation projects and large application software projects share many similarities. These include a requirements specification phase, a design phase, a code development and unit testing phase, and an integration and test phase (Hamadani, 1979). Furthermore, the tasks to be accomplished within each of these phases are similar for both types of projects. However, several studies in the literature state that the use of formal Software Requirements Engineering (SRE) activities in Modeling and Simulation (M&S) development and analysis is minimal, at best.
On the other hand, there are several aspects of large simulation projects that are different from, and more technically difficult than, large application software projects. For example, in most large application software, given a set of input parameters, it is possible both to follow the flow of execution of the program and to determine the exact output (Quaglia, Tocantins, 2011). However, in a simulation that contains concurrency, parallel flows, and random events, both of these tasks become very difficult, requiring statistical analysis at the end of the development (Clema et al., 1979).
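This difference can be seen in even a tiny stochastic model. In the sketch below (a single server with assumed exponential service times; all parameters are hypothetical), each run produces a different total time, so conclusions must come from replications and a confidence interval rather than from a single execution:

```python
import random
import statistics

def replicate(seed, n_jobs=100):
    """One replication: total time to serve n_jobs at a single
    server with a hypothetical mean service time of 2.0 minutes."""
    rng = random.Random(seed)
    return sum(rng.expovariate(1 / 2.0) for _ in range(n_jobs))

# A single run is just one draw from a random output distribution;
# statistical statements require many independent replications.
results = [replicate(seed) for seed in range(30)]
mean = statistics.mean(results)
half_width = 1.96 * statistics.stdev(results) / len(results) ** 0.5

print(f"mean total time: {mean:.1f} +/- {half_width:.1f} (95% CI)")
```

The same structure (replicate, then summarize across replications) applies regardless of the simulation platform used.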
Other key differences between simulation projects and application software projects include the need to determine at what level of detail the system should be modeled and the need to communicate this abstraction to the process owner and the simulation user community. Table 1 lists characteristics of large simulation projects that are not found in large application software projects.
Table 1 – Large Simulation Projects Characteristics
It is then easy to see that the difficulty of managing simulation projects is rooted in the lack of a rigorous simulation project management methodology.
Next, the steps and the appropriate framework for simulation project development are discussed.
The purpose of this section is to describe how the life cycle of a simulation project should be managed, and what resources are needed to ensure a successful outcome. A successful project is defined as one where the results and recommendations are accepted by the process owner, and the results are delivered within the agreed-upon time frame and at or under the agreed-upon budget allocated to the simulation project.
Developing simulation projects requires engineers cross-trained in the principles of project management and software development. Short-term projects have a decreased need for project management and software development techniques; as the estimated length of a simulation project increases, the relative importance of applying these techniques increases as well. The major components of a simulation model development project are: (a) project management practices, aligned with the software development methodologies instituted in the organization; (b) the simulation model development; and (c) the simulation techniques and platform. Energy and care should be devoted in equal parts to these three components. For instance, a powerful simulation language will ease the software development of the model, and a strong project manager will most likely keep a project under control and under budget. See Figure 1.
In general, a project has a definite beginning and end, and frequently produces a product (in our case, the simulation model) that has a life cycle of its own. Our interest here is the life cycle of the simulation model itself, with its modifications, enhancements, and adaptations driven by changes in the real-world process (Robinson, Pidd, 1995).
A project can be divided into project phases, and collectively the phases are known as a project life cycle. A project phase is a collection of logically related project activities, usually culminating in the completion of one or more major deliverables. The number of phases is usually small (fewer than ten), and each project phase can have sub-phases.
The different phases of the simulation project life cycle are examined in the following sections. Figure 2 depicts a Level 0 process map for the simulation project life cycle and some of its sub-phases, and Figure 3 shows more details of the life cycle.
As mentioned before, in spite of simulation modeling's wide acceptance for process problem solving, little is known systematically about the economics of simulation, either in terms of costs or benefits. We have observed that the decision to undertake large-scale simulations is often made casually, subjectively, and unilaterally, although this need not be the case. This section discusses how the cost of simulation may be predicted.
In general, the topic of the cost of a simulation is lightly treated in the literature. The discussion of the economics of simulation is extremely limited in the standard texts, and there are few applications papers in which costs or benefits are discussed even at a rudimentary level (Bathia, Robinson, 1995).
A common view of simulation costs is to consider only the costs of computer running time, which is readily identifiable from accounting information. However, this cost, although often large, may be merely the tip of the proverbial iceberg. The simulation process has four stages:
The last two stages may be iterative or, if the project is a fiasco, may never be reached, and large amounts of money and energy may be spent with no return on that investment.
The cost model for a simulation project can be depicted as a decision tree, following the main sub-phases covered earlier.
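As a sketch of that decision-tree view, the expected project cost can be computed by weighting each phase's cost by the probability that the project survives long enough to reach it. All costs and continuation probabilities below are hypothetical; the structure, not the numbers, is the point:

```python
# Hypothetical phase costs and probabilities of proceeding past each phase.
phases = [
    ("Requirements",        10_000, 0.95),
    ("Data Collection",      8_000, 0.90),
    ("Code & Unit Test",    25_000, 0.85),
    ("Integration & Test",  20_000, 0.90),
]

expected_cost = 0.0
p_reached = 1.0  # probability the project reaches the current phase
for name, cost, p_continue in phases:
    expected_cost += p_reached * cost  # pay this phase's cost only if reached
    p_reached *= p_continue           # chance of surviving to the next phase

print(f"expected project cost: {expected_cost:,.0f}")
```

A fuller tree would also branch on rework loops (the iterative stages noted above), but the same reach-probability weighting applies.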
Table 2 depicts what simulation practitioners typically see as the effort (as a percentage) that goes into each step of a simulation model development project (Sturrock, 2009).
|Simulation Model Phase|Total Effort (%)|Notes|
|---|---|---|
|Data Collection|5-7%|May vary substantially depending upon the degree of data repository readiness|
|Code & Unit Test|15-20%| |
|Integration and Test|15-20%| |
|User Training|5-10%|Depends on the degree of ownership and the frequency of modeling performed by the users|
Table 2 – Simulation Project Percent Effort and Cost.
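Percentages like those in Table 2 translate directly into a first-cut budget allocation. The sketch below uses the midpoint of each range against a purely hypothetical total budget; the remaining effort belongs to phases not broken out in the table:

```python
total_budget = 120_000  # hypothetical figure

# Midpoints of the effort ranges from Table 2.
effort_pct = {
    "Data Collection":      (5 + 7) / 2,
    "Code & Unit Test":     (15 + 20) / 2,
    "Integration and Test": (15 + 20) / 2,
    "User Training":        (5 + 10) / 2,
}

for phase, pct in effort_pct.items():
    # Each phase's share of the total budget, to the nearest unit.
    print(f"{phase:22s} {pct:5.1f}%  ->  {total_budget * pct / 100:>9,.0f}")
```

Such a breakdown is only a planning aid; actuals should be tracked against it as each phase completes.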
There are several pitfalls that can be faced in each of the phases of a simulation project (Potter, Wilson, 2007). Table 3 lists some of them.
|Phase|Pitfalls|
|---|---|
|Requirements|Failure to identify the objectives of the simulation model.|
| |Specification of the wrong level of detail for the type of results desired from the simulation.|
| |Failure to identify the data needed to execute the model and how this data is to be collected.|
| |Accepting unfeasible requirements.|
| |Failure to establish a procedure for controlling changes to the requirements.|
|Data Collection|Failure to adequately plan the data collection procedures.|
| |Failure to allocate enough time to collect raw data.|
| |Using statistical procedures that assume that the data is independent and identically distributed when it is not.|
|Code & Unit Testing|Allowing “clever” (read: obscure) coding.|
| |Lack of enough measurable milestones during this phase.|
| |Allowing immediate and uncontrolled responses to all requirements changes.|
| |Adding people after the implementation has started.|
|Integration & Testing|Skipping code walkthroughs and trace/debug analysis.|
| |Failure to execute the model under simplified conditions.|
| |Failure to configure the model so that the results can be analytically calculated.|
| |Failure to test that the input parameters do not change during the course of execution.|
|Validation|Failure to perform a face validity check: determining that the simulation appears reasonable to model users and others who are knowledgeable about the system being simulated.|
| |Failure to perform statistical analysis: using goodness-of-fit tests to validate assumed statistical distributions.|
| |Failure to perform a real-system comparison: comparing model output to the real system output given the same input.|
Table 3 – Simulation Project Phases Pitfalls.
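The statistical side of the validation pitfalls above can be checked with very little machinery. The sketch below computes a two-sample Kolmogorov-Smirnov statistic (standard library only) to compare model output against real-system observations. Both samples here are synthetic stand-ins drawn from an assumed exponential distribution, and the 1.36 factor is the usual large-sample 5%-level critical value:

```python
import math
import random

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap
    between the two empirical CDFs, evaluated at every data point."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in a + b:
        cdf_a = sum(v <= x for v in a) / len(a)
        cdf_b = sum(v <= x for v in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

rng = random.Random(42)
# Stand-ins for real observations and model output (hypothetical data;
# here both are drawn from the same exponential distribution).
real  = [rng.expovariate(1 / 5.0) for _ in range(200)]
model = [rng.expovariate(1 / 5.0) for _ in range(200)]

d = ks_two_sample(real, model)
n, m = len(real), len(model)
# Approximate 5%-level critical value for the two-sample KS test.
d_crit = 1.36 * math.sqrt((n + m) / (n * m))
print(f"D = {d:.3f}, critical value = {d_crit:.3f}: "
      f"{'consistent' if d < d_crit else 'inconsistent'} at the 5% level")
```

A goodness-of-fit test of this kind complements, but does not replace, face-validity review with people who know the real system.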
Applying a DMAIC-like adaptation to manage simulation projects provides a path to successful simulation projects. It also provides the rigor that a complex, resource- and cost-intensive simulation project deserves. Future steps include compiling an exhaustive listing of pitfalls and their corrective actions, providing further detailed (Level 3) process maps for the different methodology steps, and, finally, completing and justifying a set of key performance metrics and a project control dashboard.
Bathia, V. and Robinson, S. (1995). Proceedings of the 27th Winter Simulation Conference (WSC '95), IEEE Computer Society, Washington, DC, pp. 61-67.
Clema, J., Rowe, D. and Roth, P. (1979). ACM SIGSIM Simulation Digest, 10(3), p. 106.
Hamadani, H. (1979). Simulation and Project Management (Order No. U440517). Retrieved from http://search.proquest.com/docview/301410825?accountid=458.
Potter, L. and Wilson, M. (2007). Happily Ever After: Project Management Tips and Tricks, GLSMA Grantee Meeting, Portland, Oregon.
Quaglia, E. J. and Tocantins, C. A. (2011). Simulation Projects Management Using Scrum, Proceedings of the 2011 Winter Simulation Conference (WSC), pp. 3421-3430.
Robinson, S. and Pidd, M. (1995). Winter Simulation Conference Proceedings, 1995, pp. 952-962.
Shtub, A. (2010). Project Management Simulation with PTB Project Team Builder, Proceedings of the 2010 Winter Simulation Conference (WSC), pp. 242-253.
Sturrock, D. (2009). Tips for Successful Practice of Simulation, Proceedings of the 2009 Winter Simulation Conference, M. D. Rossetti, R. R. Hill, B. Johansson, A. Dunkin and R. G. Ingalls, eds.