Dear Vensim Community,
We would like to alert you to some exciting changes to the Vensim team.
Bob Eberlein, who has been head of development almost since the beginning of Vensim, has decided to embark on a new chapter of his life, starting in January. While we are sad to see him go, we greatly appreciate all of his efforts and accomplishments over the past 22 years, and wish him the very best in his new adventures.
Vensim is extremely important to our efforts here at Ventana Systems and we know that it is also important to many of you in the System Dynamics community. We are fully committed to maintaining Vensim as the leading System Dynamics software platform and to extending its features and capabilities. We have increased our investment in Vensim with the following team:
• Tom Fiddaman has taken on an additional role as Vensim Product Manager. He will make sure that new releases of Vensim address market requirements and opportunities. He will facilitate information flow between the community, our user base, and the Vensim design team.
• Tony Kennedy from Ventana Systems UK will lead the Customer Services functions, including order fulfillment, bug resolution, and the master training schedule. He will also support the Distributor network. Tony has been working with UK Vensim customers for over 10 years.
• Larry Yeager has recently joined Ventana Systems to head the future development of Vensim. Larry has led the development of many software products and applications, including the Jitia System Dynamics software for PA Consulting.
• We have formed a steering team that will provide guidance and expertise to our future product development. This team includes Alan Graham, Tony Kennedy, Tom Fiddaman, Marios Kagarlis, and David Peterson.
We are very excited about the future and look forward to continuing our great relationships with you, our clients and friends.
President & CEO
Ventana Systems, Inc.
MIT’s Climate Collaboratorium has posted Java code that it used to wrap C-LEARN as a web service using the multicontext .dll. If you’re doing something similar, you may find the code useful, particularly the VensimHelper class. The liberal MIT license applies. However, be aware that you’ll need a license for the Vensim multicontext .dll to go with it.
Vensim doesn’t have a function for the cumulative normal distribution, but it’s easy to implement via a macro. I used to use a polynomial cited in Numerical Recipes (error function, Ch. 6.2):
:MACRO: NCDF(x)
NCDF = 1-Complementary Normal CDF
Complementary Normal CDF= ERFCy/2 ~ dmnl ~ |
ERFCy = IF THEN ELSE(y>=0,ans,2-ans) ~ dmnl ~ http://www.library.cornell.edu/nr/bookcpdf/c6-2.pdf |
y = x/sqrt(2) ~ dmnl ~ |
ans=t*exp(-z*z-1.26551+t*(1.00002+t*(0.374092+t*(0.0967842+ t*(-0.186288+t*(0.278868+t*(-1.1352+t*(1.48852+ t*(-0.822152+t*0.170873))))))))) ~ dmnl ~ |
t=1/(1+0.5*z) ~ dmnl ~ |
z = ABS(y) ~ dmnl ~ |
:END OF MACRO:
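If you want to sanity-check the macro outside Vensim, the same Numerical Recipes polynomial is easy to transliterate. Here's a Python sketch (using the full-precision NR coefficients; the macro above rounds them to six figures):

```python
import math

def erfc_approx(x):
    # Numerical Recipes rational approximation to the complementary
    # error function (Ch. 6.2); absolute error below ~1.2e-7
    z = abs(x)
    t = 1.0 / (1.0 + 0.5 * z)
    ans = t * math.exp(-z * z - 1.26551223 + t * (1.00002368 + t * (0.37409196
        + t * (0.09678418 + t * (-0.18628806 + t * (0.27886807 + t * (-1.13520398
        + t * (1.48851587 + t * (-0.82215223 + t * 0.17087277)))))))))
    return ans if x >= 0 else 2.0 - ans

def ncdf(x):
    # Cumulative normal distribution, exactly as in the macro:
    # NCDF = 1 - erfc(x / sqrt(2)) / 2
    return 1.0 - erfc_approx(x / math.sqrt(2.0)) / 2.0
```

Comparing against the exact CDF (via the error function) confirms the quoted accuracy across the usual range of arguments.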
:MACRO: NCDF2(x)
NCDF2 = IF THEN ELSE(x >= 0,
(1 - c * exp( -x * x / 2 ) * t * ( t *( t * ( t * ( t * b5 + b4 ) + b3 ) + b2 ) + b1 )),
( c * exp( -x * x / 2 ) * t * ( t *( t * ( t * ( t * b5 + b4 ) + b3 ) + b2 ) + b1 )))
Implements algorithm 26.2.17 from Abramowitz and Stegun, Handbook of Mathematical
Functions. It has a maximum absolute error of 7.5e-8.
c = 0.398942
t = IF THEN ELSE( x >= 0, 1/(1+p*x), 1/(1-p*x))
b5 = 1.33027
b4 = -1.82126
b3 = 1.78148
b2 = -0.356564
b1 = 0.319382
p = 0.231642
:END OF MACRO:
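The same check works for the Abramowitz & Stegun variant. A Python sketch, with the full-precision 26.2.17 coefficients (the macro above rounds them to six figures) and names mirroring the macro's:

```python
import math

# Abramowitz & Stegun 26.2.17 coefficients (full precision)
P = 0.2316419
B1, B2, B3 = 0.319381530, -0.356563782, 1.781477937
B4, B5 = -1.821255978, 1.330274429
C = 1.0 / math.sqrt(2.0 * math.pi)  # the macro's c = 0.398942

def ncdf2(x):
    # Horner form of t*(b1 + t*(b2 + t*(b3 + t*(b4 + t*b5)))), with
    # t reflected for negative x as in the macro's IF THEN ELSE
    t = 1.0 / (1.0 + P * abs(x))
    poly = t * (B1 + t * (B2 + t * (B3 + t * (B4 + t * B5))))
    tail = C * math.exp(-x * x / 2.0) * poly
    return 1.0 - tail if x >= 0 else tail
```

This stays within the quoted 7.5e-8 of the exact CDF.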
In advanced Vensim versions, paste the macro into the header of your model (View>As Text). Otherwise, you can implement the equations inside the macro directly in your model.
This is a little experimental model that I developed to investigate stochastic allocation of rental cars, in response to a Vensim forum question.
There’s a single fleet of rental cars distributed across 50 cities, connected by a random distance matrix (probably not physically realizable on a 2D manifold, but good enough for test purposes). In each city, customers arrive at random, rent a car if one is available, and return it locally or in another city. Along the way, they dawdle a bit, so returns are essentially a 2nd-order delay of rentals: a combination of transit time and idle time.
The two interesting features here are:
- Proper use of Poisson arrivals within each time step, so that car flows are dimensionally consistent and preserve the integer constraint (no fractional cars)
- Use of Vensim’s ALLOC_P/MARKETP functions to constrain rentals when car availability is low. The usual approach, setting actual = MIN(desired, available/TIME STEP), doesn’t work because available is subscripted by 50 cities, while desired has 50 x 50 origin-destination pairs. Therefore the constrained allocation could result in fractional cars. The alternative approach is to set up a randomized first-come, first-served queue, so that any shortfall preserves the integer constraint.
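Abstracting away from the model's actual equations, the two mechanisms look roughly like this in Python (a hypothetical sketch; the names and structure are mine, not the model's):

```python
import math
import random

def poisson_sample(lam, rng=random):
    # Knuth's multiplication method; adequate for the small per-step
    # arrival counts that a short TIME STEP implies
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def allocate_fcfs(cars_available, requests_by_pair, rng=random):
    # Randomized first-come, first-served: expand the integer requests
    # per origin-destination pair into a queue, shuffle it, and serve
    # while cars remain -- any shortfall still leaves integer car counts
    queue = [pair for pair, n in requests_by_pair.items() for _ in range(n)]
    rng.shuffle(queue)
    served = {pair: 0 for pair in requests_by_pair}
    for pair in queue[:cars_available]:
        served[pair] += 1
    return served
```

In the model's terms, each step draws requests per O-D pair as `poisson_sample(demand_rate * TIME_STEP)`, allocates against the origin city's available cars, and divides by TIME STEP to recover a dimensionally consistent cars/day flow that still moves whole cars.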
The interesting experiment with this model is to lower the fleet until it becomes a constraint (at around 10,000 cars).
Documentation is sparse, but units balance.
Requires an advanced Vensim version (for arrays) or the free Model Reader.
Update, with improved distribution choice and smaller array dimensions for convenience:
As a prelude to my next look at alternative fuels models, some thoughts on spreadsheets.
Everyone loves to hate spreadsheets, and it’s especially easy to hate Excel 2007 for rearranging the interface: a productivity-killer with no discernible benefit. At the same time, everyone uses them. Magne Myrtveit wonders, Why is the spreadsheet so popular when it is so bad?
Spreadsheets are convenient modeling tools, particularly where substantial data is involved, because numerical inputs and outputs are immediately visible and relationships can be created flexibly. However, flexibility and visibility quickly become problematic when more complex models are involved, because:
- Structure is invisible and equations, using row-column addresses rather than variable names, are sometimes incomprehensible.
- Dynamics are difficult to represent; only Euler integration is practical, and propagating dynamic equations over rows and columns is tedious and error-prone.
- Without matrix subscripting, array operations are hard to identify, because they are implemented through the geography of a worksheet.
- Arrays with more than two or three dimensions are difficult to work with (row, column, sheet, then what?).
- Data and model are mixed, so that it is easy to inadvertently modify a parameter and save changes, and then later be unable to easily recover the differences between versions. It’s also easy to break the chain of causality by accidentally replacing an equation with a number.
- Implementation of scenario and sensitivity analysis requires proliferation of spreadsheets or cumbersome macros and add-in tools.
- Execution is slow for large models.
- Adherence to good modeling practices like dimensional consistency is impossible to verify formally.
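The Euler-integration point is worth making concrete: each spreadsheet row typically implements one update of the form stock(t+dt) = stock(t) + dt * net flow(t), propagated down the sheet by copy-paste. A minimal Python rendering of that scheme (illustrative only):

```python
import math

def euler(stock0, net_flow, dt, steps):
    # One spreadsheet row per step:
    # stock[t+dt] = stock[t] + dt * net_flow(stock[t])
    stock, history = stock0, [stock0]
    for _ in range(steps):
        stock += dt * net_flow(stock)
        history.append(stock)
    return history

# First-order decay, s' = -s: Euler drifts from the exact exp(-t),
# and shrinking dt (i.e., adding rows) is the only remedy a
# spreadsheet offers
trajectory = euler(1.0, lambda s: -s, dt=0.01, steps=100)
```

Dedicated tools sidestep this by offering higher-order integration (e.g., Runge-Kutta) as a solver setting rather than a sheet-restructuring exercise.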
For some of the reasons above, auditing the equations of even a modestly complex spreadsheet is an arduous task. That means spreadsheets hardly ever get audited, which contributes to many of them being lousy. (An add-in tool called Exposé can get you out of that pickle to some extent.)
There are, of course, some benefits: spreadsheets are ubiquitous and many people know how to use them. They have pretty formatting and support a wide variety of data input and output. They support many analysis tools, especially with add-ins.
For my own purposes, I generally restrict spreadsheets to data pre- and post-processing. I do almost everything else in Vensim or a programming language. Even seemingly trivial models are better in Vensim, mainly because it’s easier to avoid unit errors, and more fun to do sensitivity analysis with Synthesim.
Replicated by: Tom Fiddaman
Citation: Hatlebakk, Magnus, & Moxnes, Erling (1992). Misperceptions and Mismanagement of the Greenhouse Effect? The Simulation Model. Report No. CMR-92-A30009 (December). Christian Michelsen Research.
This is a climate-economy model, of about the same scale and vintage as Nordhaus’ original DICE model. It’s more interesting in some respects, because it includes path-dependent reversible and irreversible emissions reductions. As I recall, the original also had some stochastic elements, not active here. This version has no units; hopefully I can get an improved version online at some point.
Model Name: The Energy Transition and the Economy: A System Dynamics Approach
Source: Replicated by Miguel Vukelic (a heroic effort)
Units balance: Yes
Format: Vensim (Contains data variables and thus requires an advanced version or the free Model Reader)
Illustrations of a ‘Normal’ (first order) Outflow, a Delay Outflow, and a Fixed Delay Outflow
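As a back-of-the-envelope companion to those illustrations: a "normal" outflow drains in proportion to the stock, a fixed delay releases material exactly its residence time after it arrives, and a higher-order delay outflow sits in between (a cascade of first-order stages). A hypothetical Python sketch, not the model itself:

```python
from collections import deque

def first_order_outflow(stock, tau):
    # 'Normal' outflow: rate proportional to stock, giving an
    # exponential distribution of residence times
    return stock / tau

class FixedDelayOutflow:
    # Fixed delay: whatever flows in exits exactly `steps` time
    # steps later, unchanged in between (a pure pipeline)
    def __init__(self, steps):
        self.pipeline = deque([0.0] * steps)

    def step(self, inflow):
        self.pipeline.append(inflow)
        return self.pipeline.popleft()
```

Chaining several first-order stages end to end gives the intermediate "Delay Outflow" behavior (Vensim's DELAY3 is the three-stage case).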
Model Name: payments, penalties, and environmental ethic
Citation: Dudley, R. 2007. Payments, penalties, payouts, and environmental ethics: a system dynamics examination. Sustainability: Science, Practice, & Policy 3(2):24-35. http://ejournal.nbii.org/archives/vol3iss2/0706-013.dudley.html.
Source: Richard G. Dudley
Copyright: Richard G. Dudley (2007)
License: GNU GPL
Peer reviewed: Yes (probably when submitted for publication?)
Units balance: Yes
Target audience: People interested in the concept of payments for environmental services as a means of improving land use and conservation of natural resources.
Questions answered: How might land users’ environmental ethic be influenced by, and influence, payments for environmental services?