
General Mistakes

Invalid model assumptions

Model too complex

Model too simple

Internal inconsistency (Not paying attention to error messages)

 

Invalid model assumptions

Every model makes assumptions, and many assumptions are not, or cannot be, validated (if they were validated, they would be parameters, not assumptions). However, there are several types of checks that a user can make to ensure that the model's assumptions are consistent with each other and with the available qualitative information about the species' life history.

It is important to remember that a program such as RAMAS GIS is not a model, but a tool that helps users build their own models. RAMAS GIS has been validated at the program level by checking all subunits and algorithms of the programs, by checking the lower-level algorithms for consistency (e.g., random distributions of vital rates), and by making sure that the program does what is described in the manual and help files. The validation at the model level is the user's responsibility. For tips on how to do this, see the following topics in the RAMAS Metapop help files (the sections given below provide short introductions):

Validation and Testing

RAMAS GIS and RAMAS Metapop have been validated at the program level by checking all subunits and algorithms of the subprograms, by making sure that the program does what is described in the manual, by checking the lower-level algorithms for consistency (e.g., random distributions of vital rates), etc. RAMAS GIS and RAMAS Metapop are not models, but tools that help users build models. Validation at the model level is your responsibility.

To validate a model, start with a review of the types of results that the program gives, and make sure that they will address the question you are asking. Next, review the assumptions and limitations of the program (see the manual) and convince yourself that these fit the species you are modeling. If they do not fit (or you are not sure whether they do), then at least you are aware of these assumptions and limitations.

The major component of validation involves your input of the model parameters into the program. It is essential that you understand how these parameters are used by the program, and not rely on an intuitive understanding based on the names given to various parameters in the program. The same names or symbols may be used in entirely different contexts by different programs or textbooks, leading to several common mistakes.

If you have large amounts of data, you can also validate your model by comparing the model predictions with observations (e.g., see Brook et al. 2000). However desirable this type of validation may be, it is difficult and often impossible for stochastic models, since you need observations on the abundances of several populations as well as their variation and risks of local extinction.

The manual provides several suggestions for testing or checking your model.

 


Program Limitations and Assumptions

Models in RAMAS are expressed in discrete time. The program does not use continuous-time models. We believe that most modeling needs in population dynamics can be met with (and perhaps are most naturally expressed as) discrete-time formulations.
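To make this concrete, here is a minimal sketch in Python of a discrete-time projection (this is not RAMAS code; the growth rate and initial abundance are hypothetical). Abundance is updated once per time step rather than described by a continuous-time differential equation.

    # Minimal sketch of a discrete-time projection: N(t+1) = lambda * N(t).
    # Not RAMAS code; the values below are hypothetical.
    growth_rate = 1.05        # finite rate of increase per time step
    abundance = 100.0         # initial abundance

    trajectory = [abundance]
    for t in range(10):       # project 10 discrete time steps (e.g., years)
        abundance *= growth_rate
        trajectory.append(abundance)

    print([round(n, 1) for n in trajectory])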

Since RAMAS GIS and RAMAS Metapop models can address only a single species, they cannot explicitly represent competition, predation, mutualism, or other interspecific interactions. These interactions can be modeled as constant, stationary (randomly fluctuating with a constant mean), or deterministically varying (e.g., cyclic) influences on demographic parameters. In addition, it may be possible to model such interactions in user-written code linked to the program.

The dynamics within each population are modeled with a stage- or age-structured matrix model, which may also include sex structure. The model may contain non-linearities describing density-dependence relations between matrix elements and population abundance. The program also allows user-written code to be linked to a model. This provides a great deal of flexibility that will let you model most species. However, some very complex structures (for example, those requiring complicated rules representing social structure) may be difficult or impossible to incorporate unless they can be simplified to a matrix model.
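As a rough illustration of this kind of structure, the following Python sketch projects a hypothetical three-stage matrix model in which fecundities decline as total abundance approaches a carrying capacity. It is not the program's algorithm; the matrix entries, initial abundances, and the particular density-dependence function are invented for the example.

    import numpy as np

    # Hypothetical 3-stage matrix: top row holds fecundities, the other
    # entries hold survival/transition rates.
    stage_matrix = np.array([
        [0.0, 1.2, 2.0],
        [0.4, 0.0, 0.0],
        [0.0, 0.6, 0.8],
    ])
    abundance = np.array([50.0, 20.0, 10.0])   # hypothetical initial stage abundances
    carrying_capacity = 200.0                  # hypothetical

    for t in range(20):
        total = abundance.sum()
        dd_matrix = stage_matrix.copy()
        # A simple (assumed) form of density dependence: scale fecundities
        # down linearly as total abundance approaches the carrying capacity.
        dd_matrix[0, :] *= max(0.0, 1.0 - total / carrying_capacity)
        abundance = dd_matrix @ abundance

    print(abundance.round(1), round(float(abundance.sum()), 1))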

Within-population correlations among vital rates are restricted to +1 (full positive), 0 (none), and -1 (full negative). (Among populations, any correlation between -1 and +1 can be specified.)
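The following sketch illustrates, in simplified form, what full (+1) versus zero correlation among vital rates means when annual values are sampled. The normal distributions, means, and standard deviations are hypothetical, and this is not how the program itself samples vital rates (for instance, values are not truncated here).

    import numpy as np

    rng = np.random.default_rng(0)
    mean_surv, sd_surv = 0.7, 0.05   # hypothetical survival rate: mean and s.d.
    mean_fec,  sd_fec  = 1.5, 0.30   # hypothetical fecundity: mean and s.d.

    # Full (+1) correlation: one deviate drives both rates, so good and bad
    # years affect survival and fecundity together.
    z = rng.standard_normal()
    surv_full = mean_surv + sd_surv * z
    fec_full  = mean_fec  + sd_fec  * z

    # No (0) correlation: independent deviates for each rate.
    surv_indep = mean_surv + sd_surv * rng.standard_normal()
    fec_indep  = mean_fec  + sd_fec  * rng.standard_normal()

    print(round(surv_full, 3), round(fec_full, 3))
    print(round(surv_indep, 3), round(fec_indep, 3))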

A catastrophe can either be regional (occur at the same time for all the populations) or local (occur independently for each population).

If a model includes two different types of catastrophes, the correlation between the two catastrophes (in time) can be zero, maximum positive or maximum negative.

The program does not estimate any input parameters. The manual provides background information, simple examples, and references to the relevant literature to help the users with parameter estimation. The program includes sample files that are parameterized from literature data on several species.

The program makes certain assumptions about the way parameters are combined into the overall model. These are implicit in the equations used in the program, and include dispersal-distance and correlation-distance functions, density-dependent dispersal, density dependence functions, etc. For the exact methods and equations the program uses, see discussions in the manual (particularly the algorithm in Appendix I).
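For orientation only, here is a sketch of a generic distance-decay function of the kind referred to above. The functional form and parameter values are assumptions made for this example, not the equations the program uses; for those, see the manual (particularly Appendix I).

    import numpy as np

    # A generic (assumed) distance-decay function: the fraction dispersing
    # between two populations declines exponentially with the distance
    # between them. Parameter values are hypothetical.
    def dispersal_rate(distance, a=0.3, b=50.0):
        return a * np.exp(-distance / b)

    distances = np.array([10.0, 50.0, 150.0])   # hypothetical inter-patch distances
    print(dispersal_rate(distances).round(3))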

In addition to these limitations, any particular model built in RAMAS will have a set of assumptions, characterized by its parameters (or lack of them). As with any other program or model, how realistic the metapopulation model is largely depends on the amount and type of data. (See "Avoiding mistakes in population modeling"). This program will, hopefully, let the user utilize such data effectively in predicting extinction risks, and perhaps even guide additional field work by helping the user identify parameters to which the risk results seem to be most sensitive. However, no program or model is a substitute for empirical work.


Summary of Transparency Features

There are several features in RAMAS Metapop and RAMAS GIS that are specifically designed to make the programs transparent.

  1. The methods used in the program are described in detail in the manual as well as in several peer-reviewed publications.

  2. The source code of the program is summarized in the form of pseudo-code in a file distributed with the program, as well as in an appendix to the manual.

  3. You can link your own code to the program, with a "user-defined function" option.

  4. Several examples of "user-defined functions" are provided with the program, including their full source code as well as the compiled versions.

  5. The manual and the help file list the basic assumptions and technical limitations of the program.

  6. The program allows you to print a list of "Model Summary and Assumptions" for your model.

  7. The program displays your input parameters in the form of various graphs that allow you to examine the properties of your model.

  8. The program indicates which input parameters it is not using (e.g., if catastrophe probability is zero, then the magnitude of the catastrophe impact is not relevant).

  9. The manual and the help file include the exact definitions of the model parameters, how they are used by the program, tips on parameter estimation, and suggestions for various methods of testing and validating your model.

  10. The program makes many checks, and displays warnings if it detects potential problems in your model.

  11. The help file provides a list and detailed explanation of these warning messages, as well as suggestions about how to correct your model. 


These help topics make a number of suggestions. Two specific examples are given below.

1. Print a summary of your model (select "Export Model Summary" from the File menu to save a file that you can view and print using your browser), and examine the features and parameters of the model.


2. Examine the structure of the program, which is summarized in the form of pseudo-code in a file distributed with the program (RAMAS Metapop Algorithm.pdf in the program folder). This describes the way parameters are combined into the overall model. It is important that you understand how the program uses each parameter, and not rely on your intuitive understanding based on the parameter's name. The same name or symbol may be used in entirely different contexts by different programs or textbooks.


Also, read about how to check for and correct internal inconsistencies of your model.

Model too complex

Many modelers are tempted to add a lot of complexity to realistically reflect the life history of the species. However, complexity means more parameters to estimate, for which data may not be available. Attempts to include more details than can be justified by the quality of the available data may result in decreased predictive power (Ginzburg and Jensen 2004). In many cases, simple assumptions may be preferable to adding new factors or parameters.

For example, if the number of individuals in older age classes is small, then pooling these data into a "composite age class" (e.g., a 3+ "age" class that includes all individuals that are 3 years old and older) would allow a more precise estimate of the survival rate of these individuals. Of course, this would mean making several assumptions (survival and fecundity do not change after age 3; there is no senescence), which should be compared to what is known about the biology of the species.
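A small numerical sketch (in Python, with entirely hypothetical counts) shows why pooling helps: each age-specific survival estimate is based on very few individuals, while the pooled 3+ estimate uses them all, at the cost of assuming survival does not change after age 3.

    # Hypothetical data: for each age, (number at the start of a year,
    # number surviving to the next year).
    counts = {3: (12, 9), 4: (6, 4), 5: (3, 2), 6: (1, 1)}

    # Age-specific estimates, each based on very few individuals:
    for age, (n, s) in counts.items():
        print(f"age {age}: survival = {s}/{n} = {s / n:.2f}")

    # Pooling into a composite 3+ class gives one estimate from a larger
    # sample, assuming survival does not change after age 3:
    n_total = sum(n for n, s in counts.values())
    s_total = sum(s for n, s in counts.values())
    print(f"3+ class: survival = {s_total}/{n_total} = {s_total / n_total:.2f}")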

Another example is adding various factors (e.g., inbreeding depression, behavioral traits, Allee effects, etc.) that affect the dynamics of very small populations. Although it is preferable to model each of these factors when data are available, adding them often involves a lot of guesswork and assumptions. When data are not available, it may be better to simply ignore these factors and instead focus on the risk of decline to a threshold greater than zero (rather than the risk of extinction). When the model includes multiple populations, this can be done in two different ways: (1) use an extinction threshold for the whole metapopulation (in the Stochasticity dialog), or (2) use a local threshold for each population. See the help files and the manual for how to set these thresholds and for the difference between the two.
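As a simplified illustration of what "risk of decline to a threshold greater than zero" means (this is not the program's algorithm, and all parameter values are hypothetical), the following Python sketch runs a scalar stochastic model many times and records how often abundance falls to the threshold. In RAMAS itself, such risks are read from the simulation results rather than computed by hand.

    import numpy as np

    rng = np.random.default_rng(1)
    n_reps, n_years = 1000, 50
    initial_abundance, threshold = 200.0, 50.0
    mean_growth, sd_growth = 1.0, 0.15     # hypothetical annual growth rate: mean, s.d.

    fell_below = 0
    for _ in range(n_reps):
        n = initial_abundance
        for _ in range(n_years):
            n *= rng.normal(mean_growth, sd_growth)
            if n <= threshold:
                fell_below += 1
                break

    print(f"Estimated risk of decline to {threshold:.0f}: {fell_below / n_reps:.2f}")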

Another problem with complex models is that their results are more difficult to interpret or understand, and some results may be counter-intuitive. In many cases, even when sufficient data exist to build a complex model, it is advisable to start from a very simple version of your model (e.g., scalar, deterministic, density-independent) and add one feature at a time, running the model and examining its results at each step. Alternatively (if you have already built a complex model), you can remove one feature of the model at a time. This process of addition or elimination may let you understand the factors that contribute to the results, and notice problems in your model.

Each feature is removed in a different way. For example:

  1. To remove all variability (all stochastic elements) and run a deterministic simulation, set Replications to 0 (in General information dialog).

  2. To eliminate only catastrophes, go to Catastrophes dialog and uncheck each box under "Affects".

  3. To remove random fluctuations in vital rates, go to Standard deviations matrix dialog and set all elements to zero by using "Autofill" with CV=0.

  4. To remove age/stage structure, go to Stages dialog and delete all but one of the stages, then go to Stage Matrix dialog and enter the population growth rate ("finite rate of increase") as the only "stage matrix" element, then go to Initial abundances dialog to enter the correct value.

  5. To remove density dependence, go to Density dependence dialog, select "All populations have the same density dependence type", and set this type to "Exponential".

  6. To remove all spatial features, go to Populations dialog and delete all but one of the populations.

Model too simple

Some modelers make the opposite mistake of building a model that is (1) too simple to use all the available data and incorporate all information about the life history of the species, or (2) too simple to properly address the question at hand, or (3) too simple to incorporate all the known threats to the species. The risk in building a very simple model is that it may lead to an underestimate of the risk of decline or extinction (because all the threats may not be modeled) or to an inefficient use of the available information. The question of the appropriate level of complexity (i.e., the trade-off between realism and functionality) depends on:

1. characteristics of the species under study (e.g., its ecology),
2. what you know of the species (the availability of data), and
3. what you want to know or predict (the questions addressed).


For example, if the question to be addressed relates to harvest (hunting, fishing, etc.) of the population, or population management by translocating individuals, the results will probably depend sensitively on which age classes are harvested or translocated. In such cases, a simple model that does not include the population's age structure may not be very useful. Similarly, if different age classes show different physiological sensitivities to a particular toxin, then an ecotoxicological assessment using a structured model would be more appropriate than one that uses a scalar or unstructured model (one that does not have age or stage structure).
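The point can be made concrete with a small structured-model sketch in Python (the matrix, initial abundances, and harvest rate are hypothetical, and this is not RAMAS code): harvesting the same fraction from different age classes leads to quite different outcomes.

    import numpy as np

    A = np.array([
        [0.0, 1.5, 2.5],   # fecundities
        [0.5, 0.0, 0.0],   # survival from class 1 to class 2
        [0.0, 0.7, 0.8],   # survival into and within class 3
    ])
    n0 = np.array([100.0, 40.0, 20.0])   # hypothetical initial abundances

    def final_abundance(harvest_class, harvest_rate=0.2, years=10):
        n = n0.copy()
        for _ in range(years):
            n = A @ n
            n[harvest_class] *= 1.0 - harvest_rate   # remove a fraction of one class
        return n.sum()

    for cls in range(3):
        print(f"harvesting class {cls}: total abundance after 10 years = {final_abundance(cls):.0f}")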

Some modelers are under the impression that data from regular censuses of a population (i.e., "count data") can only be used to develop a scalar model. This impression perhaps arises because a population viability analysis using a scalar model is sometimes (confusingly) called a "count-based PVA." However, if the census is not restricted to the total population but also includes a count of individuals in each age class or stage, then such data can be used to develop an age- or stage-structured model.

An objective way to determine the appropriate level of complexity is to fit statistical models of different complexity to the available data, and compare their goodness of fit using the Akaike Information Criterion (AIC). If you are not sure of the appropriate level of complexity, you may want to use multiple models at different levels of complexity, instead of a single model, and consider the range of results you obtain as part of model uncertainty.
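As a minimal illustration of this approach (with hypothetical census data, and using the simple Gaussian-error form of AIC), the following Python sketch fits a density-independent and a density-dependent model to observed log growth rates and compares their AIC values. The shared error-variance parameter is left out of the parameter count, since it shifts both AIC values equally.

    import numpy as np

    counts = np.array([120, 131, 150, 148, 170, 165, 180, 176, 190, 185], float)
    log_r = np.log(counts[1:] / counts[:-1])   # observed log growth rates
    n = len(log_r)

    # Model 1 (density-independent): constant mean log growth rate (k = 1).
    rss1 = np.sum((log_r - log_r.mean()) ** 2)
    aic1 = n * np.log(rss1 / n) + 2 * 1

    # Model 2 (density-dependent): log growth rate declines linearly with
    # abundance, log_r = a + b * N(t)  (k = 2).
    X = np.column_stack([np.ones(n), counts[:-1]])
    coef, *_ = np.linalg.lstsq(X, log_r, rcond=None)
    rss2 = float(np.sum((log_r - X @ coef) ** 2))
    aic2 = n * np.log(rss2 / n) + 2 * 2

    print(f"AIC, density-independent model: {aic1:.2f}")
    print(f"AIC, density-dependent model:   {aic2:.2f}   (lower is better)")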

Internal inconsistency (Not paying attention to error messages)

A common mistake is to ignore signs that some aspects of the model may be inconsistent with each other. In the case of RAMAS GIS, these signs include a large number and variety of error and warning messages. Some of these messages are encountered while entering model parameters; others appear while running a simulation. An error message indicates that there is a potential mistake in your model. Such warnings are one of the advantages of using a professional program like RAMAS. So, if you see such a message, read about it in the help files to find out what it means, whether it requires corrective action, and what options are suggested for correcting it.

Another sign of inconsistency is that a parameter you had entered earlier becomes unavailable for editing (the parameter box will become gray, and you will not be able to change the parameter). This indicates that the program is ignoring the parameter, usually because of the value of some other parameter (e.g., if local catastrophe probability is zero, local multiplier that determines catastrophe impact is irrelevant and is ignored). Check each set of input parameters (all items under the Model menu) to see which parameters are not available for editing.

Another way to check for inconsistencies is to examine the various graphs and tables the program uses to summarize the input data. For this, go to the Populations dialog (under the Model menu) and click on the "Display" button, which lets you see a variety of input summaries for each population.

If you are programming your own model (or using the "user-defined density dependence" feature in RAMAS Metapop), internal model inconsistencies may be the most difficult mistakes to notice and correct. Two strategies may help: first, start from a very simple version of your model and gradually add more complex features, running the model at each step and keeping an eye out for counter-intuitive results; second, compare your results to those from a model that was developed by someone else but has similar features.
