We often get asked about CB Monte Carlo simulation models or OptQuest optimization models that produce different results each time a complete simulation or optimization is run. A few situations can cause this. In a series of posts, I will explain each cause and a possible workaround.
The first one is actually an interesting one. Most of our power users are aware that CB publishes a few functions in Excel, which can be used directly in Excel spreadsheet models. These functions show up when you select 'Crystal Ball' in the 'Insert Function' dialog in Excel. Among them are distribution-specific functions, such as CB.Beta(...) and CB.Normal(...), which return random numbers from those distributions. The caveat to be aware of is that these functions are not tied to the seed value you can set in CB 'Run Preferences'. That is, regardless of whether you choose a specific seed value in that preference, these functions always use a new seed for each evaluation. So each time you run a simulation, reset it, and rerun it, you get a new set of values, and hence different results.
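To see why the seed matters, here is a minimal Python sketch of the same idea (an analogy only, not CB's actual implementation): a generator seeded with a fixed value reproduces the identical stream of draws on every run, while an unseeded one, like CB.Normal(...) or RAND(), produces a fresh stream each time.

```python
import random

def run_simulation(trials, seed=None):
    # A seeded generator reproduces the same stream of draws every run,
    # analogous to a CB assumption honoring the 'Run Preferences' seed.
    # With seed=None, the generator is seeded from system entropy instead,
    # so every run yields different values.
    rng = random.Random(seed)
    return [rng.normalvariate(0, 1) for _ in range(trials)]

# With a fixed seed, rerunning the simulation gives identical results:
assert run_simulation(5, seed=999) == run_simulation(5, seed=999)

# Without a seed, each run produces a new set of values:
assert run_simulation(5) != run_simulation(5)
```

The function name and seed value here are made up for illustration; the point is only that reproducibility requires the random source to be initialized from the seed you control.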
This type of function is termed 'volatile' in Excel; RAND() is a familiar example. In fact, heads up: using the RAND() function in your model causes the same problem, giving different results every time you run the simulation (I will do a separate short post on that later). Volatile functions return new values each time the worksheet is recalculated.
Workaround: Use CB assumptions instead. Define them either through the GUI or through the Developer Kit. You get the usual color-coded cells, and these assumptions honor the seed setting in the 'Run Preferences' dialog, so a fixed seed gives reproducible results from run to run.
Links to other posts in this series:
Part 2: Excel RAND() function in CB models - handle with care
Part 3: Running OptQuest with low-confidence testing at Extreme Speed
Part 4: Multi-threading and seed