
Selection: with tag uncertainty [at least 200 articles] 


Maxent is not a presence-absence method: a comment on Thibaud et al

Methods in Ecology and Evolution, Vol. 5, No. 11. (November 2014), pp. 1192-1197,


[Summary] [::1] Thibaud et al. (Methods in Ecology and Evolution 2014) present a framework for simulating species and evaluating the relative effects of factors affecting the predictions from species distribution models (SDMs). They demonstrate their approach by generating presence–absence data sets for different simulated species and analysing them using four modelling methods: three presence–absence methods and Maxent, which is a presence-background modelling tool. One of their results is striking: that their use of Maxent performs well in estimating occupancy probabilities and even ...


Economic value of ecological information in ecosystem-based natural resource management depends on exploitation history

Proceedings of the National Academy of Sciences, Vol. 115, No. 7. (13 February 2018), pp. 1658-1663,


[Significance] Natural resource management is evolving toward a more holistic approach that acknowledges ecological connections among species. To date, there has been no demonstration of where or when this approach provides economic benefits. Here we find only modest economic benefits from having detailed knowledge of ecological linkages between species. However, the costs of incomplete or incorrect knowledge are unevenly distributed across user groups and are greater after historical overfishing. The ecosystem approach to natural resource management might therefore provide the greatest benefit ...


Discounting... on stilts

The University of Chicago Law Review, Vol. 74, No. 1. (2007), pp. 119-138


[Excerpt] Jeremy Bentham famously described the concept of natural rights as “nonsense upon stilts.” This Response argues that cost-benefit analysis (CBA)—a contemporary applied version of Bentham’s utilitarianism for public policy analysis—is also nonsensical in that CBA purports to resolve questions, the answers to which have already been subsumed within the framework’s architecture. In particular, CBA subsumes vital questions of intergenerational equity through its use of an exponential discount factor to adjust future costs and benefits to a present value. This discounting procedure has the practical effect of dramatically diminishing the apparent ...
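
As a reading aid (not taken from the Response itself), the sketch below shows the arithmetic the excerpt refers to: under an exponential discount factor, a far-future benefit shrinks to a small fraction of its face value even at moderate rates. The benefit size, horizons and discount rates are arbitrary illustrative numbers.

```python
# Illustrative sketch (not from the paper): present value of a fixed future
# benefit under exponential discounting, PV = FV / (1 + r)**t.
# Benefit size, horizons and discount rates below are arbitrary examples.

def present_value(future_value, rate, years):
    """Exponentially discounted present value."""
    return future_value / (1.0 + rate) ** years

benefit = 1_000_000  # a hypothetical future damage avoided
for years in (50, 100, 200):
    for rate in (0.01, 0.03, 0.07):
        pv = present_value(benefit, rate, years)
        print(f"t = {years:3d} yr, r = {rate:.0%}: PV = {pv:12,.0f}")
```

At a 3% rate, for example, a benefit two centuries away retains well under 1% of its face value, which is the kind of intergenerational judgement the excerpt argues is built into the framework rather than openly debated.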


The limits of cost/benefit analysis when disasters loom

Global Policy, Vol. 7 (May 2016), pp. 56-66,


[Abstract] Advances in estimating the costs and benefits of climate change policies are a welcome development, but a full-scale cost/benefit analysis that seeks to reduce complex value trade-offs to a single metric of net benefit maximization hides many important public policy issues, especially for disasters and catastrophes that are large, discontinuous, irreversible and uncertain. States should obtain public input on such policies. These policies involve value trade-offs that can be informed by technocratic estimates of costs, benefits and risk. However, such analyses ...


The science of value: economic expertise and the valuation of human life in US federal regulatory agencies

Social Studies of Science, Vol. 47, No. 4. (21 March 2017), pp. 441-465,


This article explores efforts to apply economic logic to human life. To do so, it looks at federal regulatory agencies, where government planners and policy makers have spent over a century trying to devise a scientifically sound way to measure the economic value of lives lost or saved by public programs. The methods they have drawn on, however, have changed drastically in the past 40 years, shifting from a ‘human capital’ approach based on models of economic productivity and producing relatively low ...


Software engineering for computational science: past, present, future

Computing in Science & Engineering (2018), pp. 1-1,


While the importance of in silico experiments for the scientific discovery process increases, state-of-the-art software engineering practices are rarely adopted in computational science. To understand the underlying causes and to identify ways of improving the current situation, we conduct a literature survey on software engineering practices in computational science. As a result of our survey, we identified 13 recurring key characteristics of scientific software development that can be divided into three groups: characteristics that result (1) from the ...


On the projection of future fire danger conditions with various instantaneous/mean-daily data sources

Climatic Change, Vol. 118, No. 3-4. (2013), pp. 827-840,


Fire danger indices are descriptors of fire potential in a large area, and combine a few variables that affect the initiation, spread and control of forest fires. The Canadian Fire Weather Index (FWI) is one of the most widely used fire danger indices in the world, and it is built upon instantaneous values of temperature, relative humidity and wind velocity at noon, together with 24-hour accumulated precipitation. However, the scarcity of appropriate data has motivated the use of daily mean ...


Valuing Mediterranean forests: towards total economic value

In Valuing Mediterranean forests: towards total economic value (2005),


This book provides a comprehensive analysis of the economic value of Mediterranean forests, including not just commonly measured benefits such as timber but also, more importantly, the public goods and externalities they provide. It consists of 25 chapters structured into 3 parts: part 1 provides an overview of the problem and of the approach followed, and summarizes the results; part 2 includes detailed national level case studies of 18 countries and territories bordering the Mediterranean Sea (Morocco, Algeria, Tunisia, Egypt, Palestine, ...


On the systematic reduction of data complexity in multimodel atmospheric dispersion ensemble modeling

Journal of Geophysical Research: Atmospheres, Vol. 117, No. D5. (16 March 2012), pp. n/a-n/a,


The aim of this work is to explore the effectiveness of theoretical information approaches for the reduction of data complexity in multimodel ensemble systems. We first exploit a weak form of independence, i.e. uncorrelation, as a mechanism for detecting linear relationships. Then, stronger and more general forms of independence measure, such as mutual information, are used to investigate dependence structures for model selection. A distance matrix, measuring the interdependence between data, is derived for the investigated measures, with the scope of ...
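
A minimal sketch of the kind of interdependence screening described above, on synthetic data rather than the dispersion-model ensemble used in the paper: pairwise distances between ensemble members are derived first from linear correlation (the weak, "uncorrelation" form of independence) and then from a histogram-based mutual information estimate. The normalisation of the mutual information is one arbitrary choice among several.

```python
# Hedged sketch (synthetic data, not the authors' implementation): pairwise
# "interdependence" distances between ensemble members, first from linear
# correlation, then from a histogram-based mutual information estimate.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000
base = rng.normal(size=n_samples)
# Synthetic ensemble: members share a common signal to different degrees.
members = np.stack([base + rng.normal(scale=s, size=n_samples)
                    for s in (0.2, 0.3, 0.8, 1.5, 3.0)])

def mutual_information(x, y, bins=32):
    """Plug-in MI estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

corr = np.corrcoef(members)
d_corr = 1.0 - np.abs(corr)                      # distance based on uncorrelation
mi = np.array([[mutual_information(a, b) for b in members] for a in members])
h = np.diag(mi)                                   # self-MI, an entropy estimate
d_mi = 1.0 - mi / np.sqrt(np.outer(h, h))         # one possible normalised MI distance

print(np.round(d_corr, 2))
print(np.round(d_mi, 2))
```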


Guidance on a harmonised framework for pest risk assessment and the identification and evaluation of pest risk management options by EFSA

EFSA Journal, Vol. 8, No. 2. (1 February 2010), 1495,


The Scientific Panel on Plant Health was requested by EFSA to develop a guidance document on a harmonised framework for risk assessment of organisms harmful to plants and plant products and the identification and evaluation of risk management options. The document provides guiding principles on assessment practices and approaches when assessing risks to plant health to support the decision-making process under Council Directive 2000/29/EC. The framework aims at implementing the fundamental principles of risk assessment as laid down in Regulation (EC) ...


Impact of asymmetric uncertainties in ice sheet dynamics on regional sea level projections

Natural Hazards and Earth System Sciences, Vol. 17, No. 12. (04 December 2017), pp. 2125-2141,


Currently a paradigm shift is made from global averaged to spatially variable sea level change (SLC) projections. Traditionally, the contribution from ice sheet mass loss to SLC is considered to be symmetrically distributed. However, several assessments suggest that the probability distribution of dynamical ice sheet mass loss is asymmetrically distributed towards higher SLC values. Here we show how asymmetric probability distributions of dynamical ice sheet mass loss impact the high-end uncertainties of regional SLC projections across the globe. For this purpose ...
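
A toy illustration of the central point, with invented numbers rather than the paper's projections: replacing a symmetric assumption for the dynamical ice-sheet term with a right-skewed one of comparable median leaves the central estimate of the summed sea-level change roughly unchanged while raising its high-end (here, 95th) percentile. The skew-normal parameters and all magnitudes are arbitrary.

```python
# Illustrative sketch (numbers are invented, not the paper's): how skewing the
# dynamical ice-sheet contribution towards high values raises the high-end
# (95th percentile) of a projected sea-level-change sum, even when the median
# contribution stays roughly the same.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100_000

other_terms = rng.normal(loc=0.40, scale=0.08, size=n)   # m, hypothetical

# Symmetric assumption for dynamical ice-sheet mass loss (m).
sym = rng.normal(loc=0.10, scale=0.05, size=n)

# Right-skewed alternative, re-centred to a comparable median.
skew = stats.skewnorm.rvs(a=8.0, loc=0.10, scale=0.09, size=n, random_state=2)
skew -= np.median(skew) - 0.10

for name, ice in (("symmetric", sym), ("right-skewed", skew)):
    total = other_terms + ice
    print(f"{name:12s} median = {np.median(total):.2f} m, "
          f"P95 = {np.percentile(total, 95):.2f} m")
```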


A new definition of complexity in a risk analysis setting

Reliability Engineering & System Safety (November 2017),


[Highlights] [::] A new definition of complexity is presented [::] It allows for improved clarity on the links between complexity and risk [::] The idea is to link complexity to activities, and the knowledge about the consequences of these at different levels [Abstract] In this paper, we discuss the concept of complexity in a risk analysis context. Inspired by the work of Johansen and Rausand, a new perspective on complexity is presented which includes several common definitions of complexity as special cases. The idea ...


Environmental and geographic variables are effective surrogates for genetic variation in conservation planning

Proceedings of the National Academy of Sciences, Vol. 114, No. 48. (28 November 2017), pp. 12755-12760,


[Significance] To protect biodiversity for the long term, nature reserves and other protected areas need to represent a broad range of different genetic types. However, genetic data are expensive and time-consuming to obtain. Here we show that freely available environmental and geographic variables can be used as effective surrogates for genetic data in conservation planning. This means that conservation planners can, with some confidence, design protected area systems to represent intraspecific genetic diversity without investing in expensive programs to obtain and analyze ...


Rules of thumb for judging ecological theories

Trends in Ecology & Evolution, Vol. 19, No. 3. (March 2004), pp. 121-126,


An impressive fit to historical data suggests to biologists that a given ecological model is highly valid. Models often achieve this fit at the expense of exaggerated complexity that is not justified by empirical evidence. Because overfitted theories complement the traditional assumption that ecology is `messy', they generally remain unquestioned. Using predation theory as an example, we suggest that a fit-driven appraisal of model value is commonly misdirected; although fit to historical data can be important, the simplicity and generality of ...
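
A toy numerical illustration of the argument (not from the paper): a more flexible model always fits the historical sample at least as well, yet typically predicts new observations from the same process less well, which is why fit alone is a weak criterion of model value. Polynomial degree stands in here for model complexity; the data-generating process and noise level are invented.

```python
# Hedged toy example: a flexible model can fit historical data better than the
# process that generated it, while predicting new data worse.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 30)
truth = lambda x: 2.0 * x                                  # simple underlying process
y_hist = truth(x) + rng.normal(scale=0.3, size=x.size)     # "historical" data
y_new = truth(x) + rng.normal(scale=0.3, size=x.size)      # future observations

for degree in (1, 3, 9):
    coef = np.polyfit(x, y_hist, degree)
    fit_hist = np.mean((np.polyval(coef, x) - y_hist) ** 2)
    fit_new = np.mean((np.polyval(coef, x) - y_new) ** 2)
    print(f"degree {degree:2d}: MSE on historical data = {fit_hist:.3f}, "
          f"MSE on new data = {fit_new:.3f}")
```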


The strategy of model building in population biology

American Scientist, Vol. 54, No. 4. (1966), pp. 421-431


[Excerpt: Cluster of models] A mathematical model is neither an hypothesis nor a theory. Unlike the scientific hypothesis, a model is not verifiable directly by experiment. For all models are both true and false. Almost any plausible proposed relation among aspects of nature is likely to be true in the sense that it occurs (although rarely and slightly). Yet all models leave out a lot and are in that sense false, incomplete, inadequate. The validation of a model is not that it ...


Risks of population extinction from demographic and environmental stochasticity and random catastrophes

The American Naturalist, Vol. 142, No. 6. (1 December 1993), pp. 911-927,


Stochastic factors affecting the demography of a single population are analyzed to determine the relative risks of extinction from demographic stochasticity, environmental stochasticity, and random catastrophes. Relative risks are assessed by comparing asymptotic scaling relationships describing how the average time to extinction, T, increases with the carrying capacity of a population, K, under each stochastic factor alone. Stochastic factors are added to a simple model of exponential growth up to K. A critical parameter affecting the extinction dynamics is the ...
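
The scaling idea can be made concrete with a toy simulation (this is not Lande's analytical model; parameters are arbitrary and runs are capped, so long extinction times are censored): average time to extinction is estimated for a Ricker-type population under demographic stochasticity only (integer, Poisson-distributed offspring numbers) and under environmental stochasticity only (random year-to-year growth on the log scale), for a few carrying capacities K.

```python
# Toy simulation, not Lande's analytical treatment: average time to extinction
# T versus carrying capacity K for a Ricker-type population, with either
# demographic or environmental stochasticity. Parameters are arbitrary; runs
# are capped at T_MAX steps, so large-K averages may be censored.
import numpy as np

rng = np.random.default_rng(3)
R, ENV_SD, T_MAX, REPS = 0.5, 1.0, 10_000, 100

def extinction_time(K, kind):
    n = float(K)
    for t in range(1, T_MAX + 1):
        growth = R * (1.0 - n / K)
        if kind == "environmental":
            n = round(n * np.exp(growth + rng.normal(scale=ENV_SD)))
        else:  # demographic
            n = rng.poisson(n * np.exp(growth))
        if n <= 0:
            return t
    return T_MAX  # censored: still extant when the run was stopped

for kind in ("demographic", "environmental"):
    for K in (4, 8, 16):
        times = [extinction_time(K, kind) for _ in range(REPS)]
        print(f"{kind:13s} K = {K:2d}: mean T = {np.mean(times):8.1f} steps "
              f"(capped at {T_MAX})")
```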


Open geospatial data: an assessment of global boundary datasets

In Proceedings of the 20th annual GIS Research UK (GISRUK 2012) (2012), 35


[Excerpt: Conclusion] Through comparison of GAUL, GADM and UNSALB boundary datasets we found that each dataset has advantages and drawbacks in terms of accuracy and usability, but overall GAUL was the best dataset due to its accuracy and completeness. While the UNSALB boundaries have the highest rate of accuracy because of validation with national mapping agencies, the dataset is limited in geographic scope. Although GADM has a global scale, many of the boundaries are outdated and it is unclear whether GADM organizers have utilized public feedback ...


Global carbon budget 2017

Earth System Science Data Discussions (13 November 2017), pp. 1-79,


Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the "global carbon budget" – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify the five major components of the global carbon budget and their uncertainties. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production ...
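
For reference, the budget identity that the five components enter is, in the notation used by the Global Carbon Project budget papers (reproduced here from memory; consult the article for the exact definitions): fossil-fuel and land-use-change emissions are balanced by the atmospheric growth rate, the ocean sink, the land sink and a residual budget imbalance.

```latex
% Global carbon budget identity (notation as recalled from the Global Carbon
% Project budget papers; see the article for exact definitions):
E_{\mathrm{FF}} + E_{\mathrm{LUC}} = G_{\mathrm{ATM}} + S_{\mathrm{OCEAN}} + S_{\mathrm{LAND}} + B_{\mathrm{IM}}
```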


Towards real-time verification of CO2 emissions

Nature Climate Change, Vol. 7, No. 12. (13 November 2017), pp. 848-850,


The Paris Agreement has increased the incentive to verify reported anthropogenic carbon dioxide emissions with independent Earth system observations. Reliable verification requires a step change in our understanding of carbon cycle variability. [\n] Emissions of CO2 from fossil fuels and industry did not change from 2014 to 2016, yet there was a record increase in CO2 concentration in the atmosphere. This apparent inconsistency is explained by the response of the natural carbon cycle to the 2015–2016 El Niño event, but it raises ...


Bias correction in species distribution models: pooling survey and collection data for multiple species

Methods in Ecology and Evolution, Vol. 6, No. 4. (1 April 2015), pp. 424-438,


[::] Presence-only records may provide data on the distributions of rare species, but commonly suffer from large, unknown biases due to their typically haphazard collection schemes. Presence–absence or count data collected in systematic, planned surveys are more reliable but typically less abundant. [::] We proposed a probabilistic model to allow for joint analysis of presence-only and survey data to exploit their complementary strengths. Our method pools presence-only and presence–absence data for many species and maximizes a joint likelihood, simultaneously estimating and adjusting ...


Point process models for presence-only analysis

Methods in Ecology and Evolution, Vol. 6, No. 4. (1 April 2015), pp. 366-379,


[::] Presence-only data are widely used for species distribution modelling, and point process regression models are a flexible tool that has considerable potential for this problem, when data arise as point events. [::] In this paper, we review point process models, some of their advantages and some common methods of fitting them to presence-only data. [::] Advantages include (and are not limited to) clarification of what the response variable is that is modelled; a framework for choosing the number and location of quadrature ...
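
One common fitting route the review covers is to approximate the inhomogeneous Poisson process likelihood with quadrature (dummy) points. The sketch below does this directly with scipy on synthetic one-dimensional data; it is a didactic reimplementation, not the code of any particular package, and the region, covariate and coefficients are invented.

```python
# Hedged sketch of one common fitting route for presence-only point process
# models: approximate the inhomogeneous Poisson process log-likelihood with
# quadrature (dummy) points and maximise it directly. Synthetic 1-D data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Environment and true intensity over a unit "region" [0, 1].
true_beta = np.array([1.0, 2.5])                 # intercept, covariate effect
covariate = lambda s: np.sin(2 * np.pi * s)

# Simulate presence points by thinning a homogeneous Poisson process.
lam_max = np.exp(true_beta[0] + abs(true_beta[1]))
cand = rng.uniform(0, 1, rng.poisson(lam_max))
lam = np.exp(true_beta[0] + true_beta[1] * covariate(cand))
presences = cand[rng.uniform(0, lam_max, cand.size) < lam]

# Quadrature points on a regular grid, each carrying weight = grid spacing.
n_quad = 200
quad = (np.arange(n_quad) + 0.5) / n_quad
w_quad = np.full(n_quad, 1.0 / n_quad)

s_all = np.concatenate([presences, quad])
X = np.column_stack([np.ones_like(s_all), covariate(s_all)])
is_presence = np.concatenate([np.ones(presences.size), np.zeros(n_quad)])

def neg_loglik(beta):
    eta = X @ beta
    # sum of log(intensity) over presences, minus the quadrature
    # approximation of the integral of the intensity over the region
    return -(np.sum(eta[is_presence == 1])
             - np.sum(w_quad * np.exp(eta[is_presence == 0])))

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print("true coefficients:", true_beta, " estimated:", np.round(fit.x, 2))
```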


How have past fire disturbances contributed to the current carbon balance of boreal ecosystems?

Biogeosciences, Vol. 13, No. 3. (04 February 2016), pp. 675-690,


Boreal fires have immediate effects on regional carbon budgets by emitting CO2 into the atmosphere at the time of burning, but they also have legacy effects by initiating a long-term carbon sink during post-fire vegetation recovery. Quantifying these different effects on the current-day pan-boreal (44–84° N) carbon balance and quantifying relative contributions of legacy sinks by past fires is important for understanding and predicting the carbon dynamics in this region. Here we used the global dynamic vegetation model ORCHIDEE–SPITFIRE (Organising Carbon and ...


The most recent view of vulnerability

In Science for disaster risk management 2017: knowing better and losing less, Vol. 28034 (2017), pp. 70-84


[Excerpt: Conclusions and key messages] Over the past decades, vulnerability research has made considerable progress in understanding some of the root causes and dynamic pressures that influence the progression of vulnerability and raised awareness that disasters are not natural but predominantly a product of social, economic and political conditions (Wisner et al., 2004). [\n] Vulnerability assessments are a response to the call for evidence by decision-makers for use in pre-disaster risk assessment, prevention and reduction, as well as the development and implementation of appropriate preparedness and effective disaster response strategies by providing information on people, communities or regions at risk. [\n] ...



Limiting global warming to 1.5 °C may still be possible

Nature (18 September 2017),


Analysis suggests that researchers have underestimated how much carbon humanity can emit before reaching this level of warming. [Excerpt] A team of climate scientists has delivered a rare bit of good news: it could be easier than previously thought to limit global warming to 1.5 °C above pre-industrial levels, as called for in the 2015 Paris climate agreement. But even if the team is right — and some researchers are already questioning the conclusions — heroic efforts to curb greenhouse-gas emissions will ...


The concept of potential natural vegetation: an epitaph?

Journal of Vegetation Science, Vol. 21, No. 6. (December 2010), pp. 1172-1178,


We discuss the usefulness of the concept of Potential Natural Vegetation (PNV), which describes the expected state of mature vegetation in the absence of human intervention. We argue that it is impossible to model PNV because of (i) the methodological problems associated with its definition and (ii) the issues related to ecosystem dynamics. We conclude that the approach to characterizing PNV is unrealistic and provides scenarios with limited predictive power. In places with a long-term human history, interpretations of PNV need ...


Fears rise for US climate report as Trump officials take reins

Nature, Vol. 548, No. 7665. (1 August 2017), pp. 15-16,


Officials at the US Environmental Protection Agency are consulting global-warming sceptics as they weigh up a technical review. ...


Big names in statistics want to shake up much-maligned P value

Nature, Vol. 548, No. 7665. (26 July 2017), pp. 16-17,


One of scientists’ favourite statistics — the P value — should face tougher standards, say leading researchers. [Excerpt] Science is in the throes of a reproducibility crisis, and researchers, funders and publishers are increasingly worried that the scholarly literature is littered with unreliable results. Now, a group of 72 prominent researchers is targeting what they say is one cause of the problem: weak statistical standards of evidence for claiming new discoveries. [\n] In many disciplines the significance of findings is judged by ...


Little evidence for fire-adapted plant traits in Mediterranean climate regions

Trends in Plant Science, Vol. 16, No. 2. (20 February 2011), pp. 69-76,


As climate change increases vegetation combustibility, humans are impacted by wildfires through loss of lives and property, leading to an increased emphasis on prescribed burning practices to reduce hazards. A key and pervading concept accepted by most environmental managers is that combustible ecosystems have traditionally burnt because plants are fire adapted. In this opinion article, we explore the concept of plant traits adapted to fire in Mediterranean climates. In the light of major threats to biodiversity conservation, we recommend caution in ...


An empirical comparison of model validation techniques for defect prediction models

IEEE Transactions on Software Engineering, Vol. 43, No. 1. (1 January 2017), pp. 1-18,


Defect prediction models help software quality assurance teams to allocate their limited resources to the most defect-prone modules. Model validation techniques, such as k -fold cross-validation, use historical data to estimate how well a model will perform in the future. However, little is known about how accurate the estimates of model validation techniques tend to be. In this paper, we investigate the bias and variance of model validation techniques in the domain of defect prediction. Analysis of 101 public defect datasets ...
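
A small experiment in the spirit of the paper, though on synthetic data and with only two techniques: 10-fold cross-validation and the out-of-sample bootstrap are each used to estimate a classifier's accuracy on a modest training sample, and their bias and variance are measured against the "true" accuracy obtained on a large held-out set. Dataset, model and repetition counts are arbitrary choices, not the paper's setup.

```python
# Hedged sketch: bias and variance of two model validation techniques,
# measured against accuracy on a large held-out set. Synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.metrics import accuracy_score
from sklearn.utils import resample

rng = np.random.default_rng(5)
X, y = make_classification(n_samples=10_300, n_features=20, n_informative=5,
                           random_state=0)
X_pool, y_pool, X_test, y_test = X[:300], y[:300], X[300:], y[300:]

cv_est, boot_est, true_perf = [], [], []
for rep in range(30):
    idx = rng.choice(300, size=150, replace=False)      # a small training sample
    Xs, ys = X_pool[idx], y_pool[idx]
    model = LogisticRegression(max_iter=1000)

    cv_est.append(cross_val_score(model, Xs, ys, cv=10).mean())

    # Out-of-sample bootstrap: train on a bootstrap sample, score on the
    # observations left out of that sample.
    scores = []
    for b in range(25):
        bi = resample(np.arange(len(ys)), random_state=b)
        oob = np.setdiff1d(np.arange(len(ys)), bi)
        m = LogisticRegression(max_iter=1000).fit(Xs[bi], ys[bi])
        scores.append(accuracy_score(ys[oob], m.predict(Xs[oob])))
    boot_est.append(np.mean(scores))

    true_perf.append(model.fit(Xs, ys).score(X_test, y_test))

for name, est in (("10-fold CV", cv_est), ("OOS bootstrap", boot_est)):
    err = np.array(est) - np.array(true_perf)
    print(f"{name:13s} bias = {err.mean():+.3f}, variance = {err.var():.4f}")
```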


Resampling methods for meta-model validation with recommendations for evolutionary computation

Evolutionary Computation, Vol. 20, No. 2. (16 February 2012), pp. 249-275,


Meta-modeling has become a crucial tool in solving expensive optimization problems. Much of the work in the past has focused on finding a good regression method to model the fitness function. Examples include classical linear regression, splines, neural networks, Kriging and support vector regression. This paper specifically draws attention to the fact that assessing model accuracy is a crucial aspect in the meta-modeling framework. Resampling strategies such as cross-validation, subsampling, bootstrapping, and nested resampling are prominent methods for model validation and ...


Combining multiple classifiers: an application using spatial and remotely sensed information for land cover type mapping

Remote Sensing of Environment, Vol. 74, No. 3. (December 2000), pp. 545-556,


This article discusses two new methods for increasing the accuracy of classifiers used for land cover mapping. The first method, called the product rule, is a simple and general method of combining two or more classification rules into a single rule. Stacked regression methods of combining classification rules are discussed and compared to the product rule. The second method of increasing classifier accuracy is a simple nonparametric classifier that uses spatial information for classification. Two data sets used for land cover mapping ...
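
A hedged sketch of the product-rule idea on synthetic data (the article's remote-sensing data and exact formulation are not reproduced here): class-membership probabilities from two different classifiers are multiplied together, renormalised, and the combined rule predicts the class with the largest product.

```python
# Hedged sketch of the product-rule idea on synthetic data: multiply the
# class-membership probabilities of two classifiers, renormalise, and predict
# the class with the highest product.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nb = GaussianNB().fit(X_tr, y_tr)

p_lr, p_nb = lr.predict_proba(X_te), nb.predict_proba(X_te)
p_prod = p_lr * p_nb
p_prod /= p_prod.sum(axis=1, keepdims=True)      # renormalise each row

for name, pred in (("logistic", p_lr.argmax(1)), ("naive Bayes", p_nb.argmax(1)),
                   ("product rule", p_prod.argmax(1))):
    print(f"{name:13s} accuracy = {accuracy_score(y_te, pred):.3f}")
```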


Bagging ensemble selection for regression

In AI 2012: Advances in Artificial Intelligence, Vol. 7691 (2012), pp. 695-706,


Bagging ensemble selection (BES) is a relatively new ensemble learning strategy. The strategy can be seen as an ensemble of the ensemble selection from libraries of models (ES) strategy. Previous experimental results on binary classification problems have shown that using random trees as base classifiers, BES-OOB (the most successful variant of BES) is competitive with (and in many cases, superior to) other ensemble learning strategies, for instance, the original ES algorithm, stacking with linear regression, random forests or boosting. Motivated by ...


Bagging ensemble selection

In AI 2011: Advances in Artificial Intelligence, Vol. 7106 (2011), pp. 251-260,


Ensemble selection has recently appeared as a popular ensemble learning method, not only because its implementation is fairly straightforward, but also due to its excellent predictive performance on practical problems. The method has been highlighted in winning solutions of many data mining competitions, such as the Netflix competition, the KDD Cup 2009 and 2010, the UCSD FICO contest 2010, and a number of data mining competitions on the Kaggle platform. In this paper we present a novel variant: bagging ensemble selection. ...
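
The inner procedure that bagging ensemble selection wraps in a bagging loop is greedy forward selection, with replacement, of models from a library so as to maximise performance on a hillclimbing set. The sketch below shows only that inner step, on synthetic data with a small invented library; roughly speaking, the bagged variant repeats it on bootstrap replicates of the library or data and aggregates the resulting ensembles.

```python
# Hedged sketch of Caruana-style ensemble selection (the inner step of bagging
# ensemble selection): greedy forward selection, with replacement, from a
# small model library, guided by accuracy on a hillclimbing set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=3000, n_features=20, n_informative=8,
                           random_state=1)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=1)
X_hill, X_te, y_hill, y_te = train_test_split(X_rest, y_rest, test_size=0.5,
                                              random_state=1)

library = [LogisticRegression(max_iter=1000),
           GaussianNB(),
           KNeighborsClassifier(15),
           DecisionTreeClassifier(max_depth=3, random_state=1),
           DecisionTreeClassifier(max_depth=None, random_state=1)]
probs_hill = [m.fit(X_tr, y_tr).predict_proba(X_hill)[:, 1] for m in library]
probs_test = [m.predict_proba(X_te)[:, 1] for m in library]

ensemble = []                                    # model indices, with replacement
for _ in range(25):                              # greedy forward selection
    best_i, best_acc = None, -1.0
    for i in range(len(library)):
        avg = np.mean([probs_hill[j] for j in ensemble + [i]], axis=0)
        acc = accuracy_score(y_hill, (avg > 0.5).astype(int))
        if acc > best_acc:
            best_i, best_acc = i, acc
    ensemble.append(best_i)

avg_test = np.mean([probs_test[j] for j in ensemble], axis=0)
print("times each model was selected:", np.bincount(ensemble, minlength=len(library)))
print("ensemble test accuracy:", round(accuracy_score(y_te, (avg_test > 0.5).astype(int)), 3))
```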


When the appeal of a dominant leader is greater than a prestige leader

Proceedings of the National Academy of Sciences, Vol. 114, No. 26. (27 June 2017), pp. 6734-6739,


[Significance] We examine why dominant/authoritarian leaders attract support despite the presence of other admired/respected candidates. Although evolutionary psychology supports both dominance and prestige as viable routes for attaining influential leadership positions, extant research lacks theoretical clarity explaining when and why dominant leaders are preferred. Across three large-scale studies we provide robust evidence showing how economic uncertainty affects individuals’ psychological feelings of lack of personal control, resulting in a greater preference for dominant leaders. This research offers important theoretical explanations for why, around ...


Global environmental issues and the emergence of Second Order Science

Vol. 12803 (1990)


[Excerpt: Introduction] The fundamental achievements of science, like those of all creative activities, have a timeless quality. The social activity of science, like any other, evolves in response to its changing circumstances, in its objects, methods and social functions. In the high Middle Ages, the independence of secular learning was established in the universities, removed from the monasteries; and the boundary between the sacred and private on the one hand, and the secular and public on the other, was set for ...


Fuzziness vs. probability

International Journal of General Systems, Vol. 17, No. 2-3. (June 1990), pp. 211-240,


Fuzziness is explored as an alternative to randomness for describing uncertainty. The new sets-as-points geometric view of fuzzy sets is developed. This view identifies a fuzzy set with a point in a unit hypercube and a nonfuzzy set with a vertex of the cube. Paradoxes of two-valued logic and set theory, such as Russell's paradox, correspond to the midpoint of the fuzzy cube. The fundamental questions of fuzzy theory—How fuzzy is a fuzzy set? How much is one fuzzy set a ...
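
A small numerical illustration of the sets-as-points view (definitions recalled from Kosko's work; treat them as a reading aid rather than a substitute for the article): a fuzzy set over n elements is a point in the unit hypercube, and its fuzziness can be scored by comparing the overlap of the set with its own complement to their union.

```python
# Sets-as-points illustration: a fuzzy set on n elements is a membership
# vector in [0, 1]^n; crisp sets are cube vertices, and the cube midpoint
# is maximally fuzzy.
import numpy as np

def count(a):                   # sigma-count M(A)
    return float(np.sum(a))

def fuzzy_entropy(a):           # E(A) = M(A AND not-A) / M(A OR not-A)
    return count(np.minimum(a, 1 - a)) / count(np.maximum(a, 1 - a))

crisp    = np.array([1.0, 0.0, 1.0, 0.0])   # a cube vertex
midpoint = np.array([0.5, 0.5, 0.5, 0.5])   # the maximally fuzzy point
mixed    = np.array([0.9, 0.2, 0.7, 0.1])

for name, a in (("crisp", crisp), ("midpoint", midpoint), ("mixed", mixed)):
    print(f"{name:8s} fuzzy entropy = {fuzzy_entropy(a):.2f}")
```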


Building confidence in climate model projections: an analysis of inferences from fit

WIREs Clim Change, Vol. 8, No. 3. (1 May 2017), n/a,


Climate model projections are used to inform policy decisions and constitute a major focus of climate research. Confidence in climate projections relies on the adequacy of climate models for those projections. The question of how to argue for the adequacy of models for climate projections has not gotten sufficient attention in the climate modeling community. The most common way to evaluate a climate model is to assess in a quantitative way degrees of ‘model fit’; that is, how well model results ...


Novel climates, no-analog communities, and ecological surprises

Frontiers in Ecology and the Environment, Vol. 5, No. 9. (November 2007), pp. 475-482,


No-analog communities (communities that are compositionally unlike any found today) occurred frequently in the past and will develop in the greenhouse world of the future. The well documented no-analog plant communities of late-glacial North America are closely linked to “novel” climates also lacking modern analogs, characterized by high seasonality of temperature. In climate simulations for the Intergovernmental Panel on Climate Change A2 and B1 emission scenarios, novel climates arise by 2100 AD, primarily in tropical and subtropical regions. These future novel ...


Multispecies coalescent delimits structure, not species

Proceedings of the National Academy of Sciences, Vol. 114, No. 7. (14 February 2017), pp. 1607-1612,


[Significance] Despite its widespread application to the species delimitation problem, our study demonstrates that what the multispecies coalescent actually delimits is structure. The current implementations of species delimitation under the multispecies coalescent do not provide any way for distinguishing between structure due to population-level processes and that due to species boundaries. The overinflation of species due to the misidentification of general genetic structure for species boundaries has profound implications for our understanding of the generation and dynamics of biodiversity, because any ecological ...


A review of the combination among global change factors in forests, shrublands and pastures of the Mediterranean Region: beyond drought effects

Global and Planetary Change, Vol. 148 (January 2017), pp. 42-54,


[Highlights] [::] Different global change factors combine causing unprecedented ecological effects. [::] Much more complex interactions arise when combinations occur together. [::] Drought should be considered when designing and applying management policies. [::] Conserving Mediterranean terrestrial ecosystems is a collective effort. [Abstract] Climate change, alteration of atmospheric composition, land abandonment in some areas and land use intensification in others, wildfires and biological invasions threaten forests, shrublands and pastures all over the world. However, the impacts of the combinations between global change factors are not well understood despite ...


Viewing forests through the lens of complex systems science

Ecosphere, Vol. 5, No. 1. (January 2014), art1,


Complex systems science provides a transdisciplinary framework to study systems characterized by (1) heterogeneity, (2) hierarchy, (3) self-organization, (4) openness, (5) adaptation, (6) memory, (7) non-linearity, and (8) uncertainty. Complex systems thinking has inspired both theory and applied strategies for improving ecosystem resilience and adaptability, but applications in forest ecology and management are just beginning to emerge. We review the properties of complex systems using four well-studied forest biomes (temperate, boreal, tropical and Mediterranean) as examples. The lens of complex systems ...


From management to stewardship: viewing forests as complex adaptive systems in an uncertain world

Conservation Letters, Vol. 8, No. 5. (September 2015), pp. 368-377,


The world's forests and forestry sector are facing unprecedented biological, political, social, and climatic challenges. The development of appropriate, novel forest management and restoration approaches that adequately consider uncertainty and adaptability are hampered by a continuing focus on production of a few goods or objectives, strong control of forest structure and composition, and most importantly the absence of a global scientific framework and long-term vision. Ecosystem-based approaches represent a step in the right direction, but are limited in their ability to ...


The ability of climate envelope models to predict the effect of climate change on species distributions

Global Change Biology, Vol. 12, No. 12. (1 December 2006), pp. 2272-2281,


Climate envelope models (CEMs) have been used to predict the distribution of species under current, past, and future climatic conditions by inferring a species' environmental requirements from localities where it is currently known to occur. CEMs can be evaluated for their ability to predict current species distributions but it is unclear whether models that are successful in predicting current distributions are equally successful in predicting distributions under different climates (i.e. different regions or time periods). We evaluated the ability of CEMs ...


Ecological responses to recent climate change

Nature, Vol. 416 (2002), pp. 389-395,


There is now ample evidence of the ecological impacts of recent climate change, from polar terrestrial to tropical marine environments. The responses of both flora and fauna span an array of ecosystems and organizational hierarchies, from the species to the community levels. Despite continued uncertainty as to community and ecosystem trajectories under global change, our review exposes a coherent pattern of ecological change across systems. Although we are only at an early stage in the projected trends of global warming, ecological ...


Keep it complex

Nature, Vol. 468, No. 7327. (23 December 2010), pp. 1029-1031,


When knowledge is uncertain, experts should avoid pressures to simplify their advice. Render decision-makers accountable for decisions, says Andy Stirling. ...


Sample selection bias and presence-only distribution models: implications for background and pseudo-absence data

Ecological Applications, Vol. 19, No. 1. (January 2009), pp. 181-197,


Most methods for modeling species distributions from occurrence records require additional data representing the range of environmental conditions in the modeled region. These data, called background or pseudo-absence data, are usually drawn at random from the entire region, whereas occurrence collection is often spatially biased toward easily accessed areas. Since the spatial bias generally results in environmental bias, the difference between occurrence collection and background sampling may lead to inaccurate models. To correct the estimation, we propose choosing background data with ...
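
A toy sketch of the core suggestion, with an invented survey-effort surface rather than the paper's data: instead of drawing background points uniformly over the region, draw them with the same spatial bias that affected the occurrence records, so that the contrast between presences and background is not dominated by accessibility.

```python
# Toy sketch (invented bias surface): draw background points with the same
# spatial bias as the occurrence records rather than uniformly at random.
import numpy as np

rng = np.random.default_rng(7)
grid = 50                                        # region as a grid of cells
xx, yy = np.meshgrid(np.linspace(0, 1, grid), np.linspace(0, 1, grid))

# Hypothetical survey-effort (bias) surface: sampling concentrated near a
# "road" along the left edge of the region.
bias = np.exp(-5.0 * xx)
p_bias = (bias / bias.sum()).ravel()

n_background = 1000
uniform_bg = rng.choice(grid * grid, size=n_background, replace=True)
biased_bg = rng.choice(grid * grid, size=n_background, replace=True, p=p_bias)

for name, cells in (("uniform", uniform_bg), ("bias-matched", biased_bg)):
    mean_x = xx.ravel()[cells].mean()
    print(f"{name:12s} background: mean x-coordinate = {mean_x:.2f}")
```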


Model-based uncertainty in species range prediction

Journal of Biogeography, Vol. 33, No. 10. (October 2006), pp. 1704-1711,


[Aim]  Many attempts to predict the potential range of species rely on environmental niche (or ‘bioclimate envelope’) modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. [Location]  The Western Cape of South Africa. [Methods]  We applied nine of the most widely used modelling techniques to model potential ...


Predicting the impacts of climate change on the distribution of species: are bioclimate envelope models useful?

Global Ecology and Biogeography, Vol. 12, No. 5. (1 September 2003), pp. 361-371,


Modelling strategies for predicting the potential impacts of climate change on the natural distribution of species have often focused on the characterization of a species’ bioclimate envelope. A number of recent critiques have questioned the validity of this approach by pointing to the many factors other than climate that play an important part in determining species distributions and the dynamics of distribution changes. Such factors include biotic interactions, evolutionary change and dispersal ability. This paper reviews and evaluates criticisms of bioclimate ...


Improving generalized regression analysis for the spatial prediction of forest communities

Journal of Biogeography, Vol. 33, No. 10. (October 2006), pp. 1729-1749,


[Aim] This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. [Location] State of Vaud, ...


Modeling the probability of resource use: the effect of, and dealing with, detecting a species imperfectly

Journal of Wildlife Management, Vol. 70, No. 2. (1 April 2006), pp. 367-374,


Resource-selection probability functions and occupancy models are powerful methods of identifying areas within a landscape that are highly used by a species. One common design/analysis method for estimation of a resource-selection probability function is to classify a sample of units as used or unused and estimate the probability of use as a function of independent variables using, for example, logistic regression. This method requires that resource units are correctly classified as unused (i.e., the species is never undetected in a used ...
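
One standard way of handling the imperfect-detection problem described above is a single-season occupancy model in the style of MacKenzie and colleagues; the sketch below simulates detection histories with constant occupancy (psi) and detection (p) probabilities and recovers both by maximum likelihood with scipy. It is a generic illustration, not the paper's own analysis.

```python
# Hedged sketch of a standard single-season occupancy model: simulate
# detection histories with constant psi and p, then recover both by maximum
# likelihood. Parameters and sample sizes are arbitrary.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(8)
n_sites, n_visits = 200, 4
psi_true, p_true = 0.6, 0.4

occupied = rng.uniform(size=n_sites) < psi_true
detections = (rng.uniform(size=(n_sites, n_visits))
              < p_true) & occupied[:, None]         # 0/1 detection histories

def neg_loglik(params):
    psi, p = expit(params)                           # keep both in (0, 1)
    d = detections.sum(axis=1)
    # Sites with at least one detection are certainly used; all-zero
    # histories may be either unused or used but missed on every visit.
    lik = np.where(d > 0,
                   psi * p**d * (1 - p)**(n_visits - d),
                   psi * (1 - p)**n_visits + (1 - psi))
    return -np.sum(np.log(lik))

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
psi_hat, p_hat = expit(fit.x)
naive = (detections.sum(axis=1) > 0).mean()          # ignores imperfect detection
print(f"naive occupancy = {naive:.2f}, "
      f"estimated psi = {psi_hat:.2f}, estimated p = {p_hat:.2f} "
      f"(true: {psi_true}, {p_true})")
```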

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database.



Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic content (machine-readable) is also made human-readable thanks to the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is implemented thanks to the Internet Archive.
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors. They are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete and heterogeneous work in progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and then improved with the help of several volunteers (with a major technical upgrade in 2011). The current integrated interface has been operational since 2014.