From MFKP_wiki


Selection: with tag mathematics [78 articles] 


Aversion to ambiguity and model misspecification in dynamic stochastic environments

Proceedings of the National Academy of Sciences, Vol. 115, No. 37. (11 September 2018), pp. 9163-9168,


[Significance] In many dynamic economic settings, a decision maker finds it challenging to quantify the uncertainty or assess the potential for mistakes in models. We explore alternative ways of acknowledging these challenges by drawing on insights from decision theory as conceptualized in statistics, engineering, and economics. We suggest tractable and revealing ways to incorporate behavioral responses to uncertainty, broadly conceived. Our analysis adopts recursive intertemporal preferences for decision makers that allow them to be ambiguity averse and concerned about the potential misspecification ...


Kernel-based measures of association

Wiley Interdisciplinary Reviews: Computational Statistics, Vol. 10, No. 2. (March 2018), e1422,


Measures of association have been widely used for describing statistical relationships between two sets of variables. Traditionally, such association measures focus on specialized settings. Based on an in‐depth summary of existing common measures, we present a general framework for association measures that unifies existing methods and novel extensions based on kernels, including practical solutions to computational challenges. Specifically, we introduce association screening and variable selection via maximizing kernel‐based association measures. We also develop a backward dropping procedure for feature selection when ...


First- and second-order conservative remapping schemes for grids in spherical coordinates

Monthly Weather Review, Vol. 127, No. 9. (1 September 1999), pp. 2204-2210,


Coupling atmosphere, ocean, sea ice, and land surface models requires a means for remapping fields between grids in an accurate and conservative manner. A method is described here for computing interpolation weights for first- and second-order conservative remappings. The method is completely general and can be used for any grid on a sphere. ...
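The remapping idea in this abstract can be illustrated in one dimension, where the first-order conservative weights reduce to fractional overlaps of cell edges (a sketch only; the paper treats arbitrary grids on the sphere, and the function below is purely illustrative):

```python
# 1-D analogue of first-order conservative remapping: each destination cell
# receives source values weighted by the fractional overlap of cell edges,
# so the area-integrated quantity is preserved.
def conservative_remap_1d(src_edges, src_vals, dst_edges):
    out = []
    for d0, d1 in zip(dst_edges, dst_edges[1:]):
        total = 0.0
        for (s0, s1), v in zip(zip(src_edges, src_edges[1:]), src_vals):
            overlap = max(0.0, min(d1, s1) - max(d0, s0))
            total += v * overlap
        out.append(total / (d1 - d0))
    return out

print(conservative_remap_1d([0, 1, 2], [10.0, 20.0], [0, 2]))  # area mean: [15.0]
```

Conservation means the integral of the field over the destination grid equals the integral over the source grid, whatever the destination edges are.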


Hierarchical Bayesian modeling

In Subjective and Objective Bayesian Statistics: Principles, Models, and Applications, Second Edition (25 November 2002), pp. 336-358,
edited by S. James Press


[Excerpt: Introduction] Hierarchical modeling is a widely used approach to building complex models by specifying a series of more simple conditional distributions. It naturally lends itself to Bayesian inference, especially using modern tools for Bayesian computation. In this chapter we first present essential concepts of hierarchical modeling, and then suggest its generality by presenting a series of widely used specific models. [...] [\n] [...] [Summary] In this chapter we have introduced hierarchical modeling as a very general approach to specifying complex models through a ...


The lack of a priori distinctions between learning algorithms

Neural Computation, Vol. 8, No. 7. (1 October 1996), pp. 1341-1390,


This is the first of two papers that use off-training set (OTS) error to investigate the assumption-free relationship between learning algorithms. This first paper discusses the senses in which there are no a priori distinctions between learning algorithms. (The second paper discusses the senses in which there are such distinctions.) In this first paper it is shown, loosely speaking, that for any two algorithms A and B, there are “as many” targets (or priors over targets) for which A has lower ...


Maryam Mirzakhani (1977–2017)

Science, Vol. 357, No. 6353. (25 August 2017), p. 758,


On 14 July, Maryam Mirzakhani, a luminary in pure mathematics, died of cancer at the age of 40. Her achievements had been most recently honored in 2014 by the Fields Medal, the most prestigious award in mathematics. [Excerpt] [...] There are many possible uniformly curved shapes into which the surface can be bent. These shapes are called hyperbolic metrics and exhibit the non-Euclidean geometry discovered in the 1800s after 2000 years of attempts to prove its nonexistence. The plethora of all possible ...



In NIST/SEMATECH e-Handbook of Statistical Methods (2012),


[Excerpt: Definitions of order statistics and ranks] For a series of measurements Y1, …, YN, denote the data ordered in increasing order of magnitude by Y〈1〉, …, Y〈N〉. These ordered data are called order statistics. If Y〈j〉 is the order statistic that corresponds to the measurement Yᵢ, then the rank for Yᵢ is j; i.e., [::] Y〈j〉 ∼ Yᵢ, rᵢ=j. [Definition of percentiles] Order statistics provide a way of estimating proportions of the data that should fall above and below a ...
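The definitions quoted above can be sketched directly (a hypothetical helper, assuming distinct measurements so each rank is unique):

```python
# Sketch of the order-statistic / rank definitions above (assumes distinct values).
def ranks(values):
    # Order statistics: the data sorted in increasing order of magnitude.
    ordered = sorted(values)
    # The rank of Y_i is the j such that the j-th order statistic equals Y_i.
    position = {y: j + 1 for j, y in enumerate(ordered)}
    return [position[y] for y in values]

print(ranks([3.2, 1.5, 2.7]))  # smallest value gets rank 1: [3, 1, 2]
```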


Sample quantiles in statistical packages

The American Statistician, Vol. 50, No. 4. (1 November 1996), pp. 361-365,


There are a large number of different definitions used for sample quantiles in statistical computer packages. Often within the same package one definition will be used to compute a quantile explicitly, while other definitions may be used when producing a boxplot, a probability plot, or a QQ plot. We compare the most commonly implemented sample quantile definitions by writing them in a common notation and investigating their motivation and some of their properties. We argue that there is a need to ...
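Two of the competing definitions the paper compares can be written out in a few lines; the "type 1"/"type 7" labels follow the common numbering convention and are an assumption here, not the paper's notation:

```python
import math

# Type 1: inverse of the empirical distribution function (no interpolation).
def quantile_type1(data, p):
    x = sorted(data)
    return x[max(math.ceil(len(x) * p), 1) - 1]

# Type 7: linear interpolation between order statistics
# (the default in several statistical packages).
def quantile_type7(data, p):
    x = sorted(data)
    h = (len(x) - 1) * p
    j = math.floor(h)
    g = h - j
    return x[j] if j + 1 >= len(x) else x[j] + g * (x[j + 1] - x[j])

data = [1.0, 2.0, 3.0, 4.0]
print(quantile_type1(data, 0.5), quantile_type7(data, 0.5))  # 2.0 vs 2.5
```

Even for the median of four points the two definitions disagree, which is exactly the kind of inconsistency the paper documents.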


Detecting long-range correlations with detrended fluctuation analysis

Physica A: Statistical Mechanics and its Applications, Vol. 295, No. 3-4. (June 2001), pp. 441-454,


We examine the detrended fluctuation analysis (DFA), which is a well-established method for the detection of long-range correlations in time series. We show that deviations from scaling which appear at small time scales become stronger in higher orders of DFA, and suggest a modified DFA method to remove them. The improvement is necessary especially for short records that are affected by non-stationarities. Furthermore, we describe how crossovers in the correlation behavior can be detected reliably and determined quantitatively and show how ...
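A minimal first-order DFA, as described above, might look like the following sketch (illustrative only; the fluctuation F(s) would then be examined across several window sizes s to estimate the scaling exponent):

```python
import math

# First-order DFA: integrate the demeaned series, detrend each window of
# size s with a least-squares line, report the RMS residual fluctuation F(s).
def dfa_fluctuation(series, s):
    n = len(series)
    mean = sum(series) / n
    # Profile: cumulative sum of the demeaned series.
    profile, total = [], 0.0
    for x in series:
        total += x - mean
        profile.append(total)
    sq_sum, count = 0.0, 0
    for start in range(0, n - s + 1, s):
        window = profile[start:start + s]
        t = list(range(s))
        tm = sum(t) / s
        ym = sum(window) / s
        denom = sum((ti - tm) ** 2 for ti in t)
        slope = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, window)) / denom
        # Accumulate squared residuals around the fitted line.
        for ti, yi in zip(t, window):
            sq_sum += (yi - (ym + slope * (ti - tm))) ** 2
            count += 1
    return math.sqrt(sq_sum / count)
```

For uncorrelated noise F(s) grows roughly as s^0.5; long-range correlated series show a different exponent, which is what DFA is used to detect.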


Statistical significance of seasonal warming/cooling trends

Proceedings of the National Academy of Sciences, Vol. 114, No. 15. (11 April 2017), pp. E2998-E3003,


[Significance] The question whether a seasonal climatic trend (e.g., the increase of spring temperatures in Antarctica in the last decades) is of anthropogenic or natural origin is of great importance because seasonal climatic trends may considerably affect ecological systems, agricultural yields, and human societies. Previous studies assumed that the seasonal records can be treated as independent and are characterized by short-term memory only. Here we show that both assumptions, which may lead to a considerable overestimation of the trend significance, do not ...


General variational approach to the interpolation problem

Computers & Mathematics with Applications, Vol. 16, No. 12. (1988), pp. 983-992,


The Talmi and Gilat variational approach to the interpolation problem in arbitrary dimension is presented together with the corresponding physical model. The connection of this approach to some known spline methods is demonstrated and new interpolation functions are derived for one-, two- and three-dimensional cases. They are designed to be flexible through the use of meaningful parameters and to give good approximations of both the function itself and its derivatives. ...


Smooth interpolation of scattered data by local thin plate splines

Computers & Mathematics with Applications, Vol. 8, No. 4. (1982), pp. 273-281,


An algorithm and the corresponding computer program for solution of the scattered data interpolation problem is described. Given points (xk, yk, fk), k = 1,…,N a locally defined function F(x, y) which has the property F(xk, yk) = fk, k = 1,…,N is constructed. The algorithm is based on a weighted sum of locally defined thin plate splines, and yields an interpolation function which is differentiable. The program is available from the author. ...


Multivariate binary discrimination by the kernel method

Biometrika, Vol. 63, No. 3. (1 December 1976), pp. 413-420,


An extension of the kernel method of density estimation from continuous to multivariate binary spaces is described. Its simple nonparametric nature together with its consistency properties make it an attractive tool in discrimination problems, with some advantages over already proposed parametric counterparts. The method is illustrated by an application to a particular medical diagnostic problem. Simple extensions of the method to categorical data and to data of mixed binary and continuous form are indicated. ...


Can one take the logarithm or the sine of a dimensioned quantity or a unit? Dimensional analysis involving transcendental functions

Journal of Chemical Education, Vol. 88, No. 1. (1 January 2011), pp. 67-70,


The fate of dimensions of dimensioned quantities that are inserted into the argument of transcendental functions such as logarithms, exponentiation, trigonometric, and hyperbolic functions is discussed. Emphasis is placed on common misconceptions that are not often systematically examined in undergraduate courses of physical sciences. The argument of dimensional inhomogeneity of the terms of a Taylor expansion of a transcendental function presented in some nonpeer-reviewed popular Internet sites is shown to be false. ...


Globally effective questioning in the Analytic Hierarchy Process

European Journal of Operational Research, Vol. 48, No. 1. (September 1990), pp. 88-97,


A drawback in the use of the Analytic Hierarchy Process (AHP) is the effort required to complete all pairwise comparisons in large hierarchies. The Incomplete Pairwise Comparison (IPC) technique developed by Harker [1,2] aims at reducing this effort by ordering the questions in decreasing informational value and by stopping the process when the added value of questions decreases below a certain level. This paper proposes further opportunities for effort reduction through a globally effective elicitation process. A simple example demonstrates impressive savings in ...


Towards a web-based collaborative weighting method in project

In IEEE International Conference on Systems, Man and Cybernetics (2002), 6,


For the purpose of a product design or a project in general, weighting a set of comparable criteria has been proven to be of utmost importance (e.g. weighting product functions in value analysis - VA -, and allocating a budget in a Design-To-Cost project). Moreover the weighting problem is related to basic properties in the field of Multi-Criteria Decision Analysis (MCDA) through the notions of ordinal transitivity and rationality in the designers' mind. How should designers or project agents decide in ...


ePiX tutorial and reference manual



[Excerpt: Introduction] ePiX, a collection of batch utilities, creates mathematically accurate figures, plots, and animations containing LaTeX typography. The input syntax is easy to learn, and the user interface resembles that of LaTeX itself: You prepare a scene description in a text editor, then “compile” the input file into a picture. LaTeX- and web-compatible output types include a LaTeX picture-like environment written with PSTricks, tikz, or eepic macros; vector images (eps, ps, and pdf); and bitmapped images and movies (png, mng, and gif). [\n] ePiX’s strengths include: [::] Quality of ...


Partial distance correlation with methods for dissimilarities

The Annals of Statistics, Vol. 42, No. 6. (December 2014), pp. 2382-2412,


Distance covariance and distance correlation are scalar coefficients that characterize independence of random vectors in arbitrary dimension. Properties, extensions and applications of distance correlation have been discussed in the recent literature, but the problem of defining the partial distance correlation has remained an open question of considerable interest. The problem of partial distance correlation is more complex than partial correlation partly because the squared distance covariance is not an inner product in the usual linear space. For the definition of partial ...


Energy distance

Wiley Interdisciplinary Reviews: Computational Statistics, Vol. 8, No. 1. (January 2016), pp. 27-38,


Energy distance is a metric that measures the distance between the distributions of random vectors. Energy distance is zero if and only if the distributions are identical, thus it characterizes equality of distributions and provides a theoretical foundation for statistical inference and analysis. Energy statistics are functions of distances between observations in metric spaces. As a statistic, energy distance can be applied to measure the difference between a sample and a hypothesized distribution or the difference between two or more samples ...
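The defining formula, 2·E|X−Y| − E|X−X′| − E|Y−Y′|, translates directly into a two-sample statistic (a one-dimensional sketch; the general metric-space version substitutes the appropriate distance):

```python
# Mean absolute difference over all pairs drawn from two samples.
def mean_abs_diff(a, b):
    return sum(abs(x - y) for x in a for y in b) / (len(a) * len(b))

# Sample energy distance: 2*E|X-Y| - E|X-X'| - E|Y-Y'|.
def energy_distance(x, y):
    return 2.0 * mean_abs_diff(x, y) - mean_abs_diff(x, x) - mean_abs_diff(y, y)

print(energy_distance([1.0, 2.0], [1.0, 2.0]))  # identical samples give 0.0
```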



In Free Software Directory (July 2009)


An optimal interpolation toolbox for Octave. This package provides functions to perform n-dimensional optimal interpolation of arbitrarily distributed data points ...


(INRMM-MiD internal record) List of keywords of the INRMM meta-information database - part 24

(February 2014)
Keywords: inrmm-list-of-tags   mass-extinction   mass-spectrometry   mast-fruiting   mastixioideae   mastrave-modelling-library   mathematical-reasoning   mathematics   mating-pattern   matlab   matsucoccus-feytaudi   mattesia-schwenkei   mature-forest   mauritia-flexuosa   max-temperature   maxent   maximum-habitat-suitability   maxwell-quantity-notation   mcarthur-mark-5   mcd43   mcd43a   mcpfe   meadow   meadows   meat-consumption   mechanical-testing   mechanics   mechanistic-approach   medetera-signaticornis   median   mediawiki   medicago-arborea   medical-herb   medicinal-plants   mediterranean-pines   mediterranean-region   medium-resolution   megastigmus-brevicaudis   megastigmus-spp   megastigmus-wachtli   mekong-river-basin   melaleuca-quinquenervia   melampsora   melampsora-larici-populina   melanophila-picta   meles-meles   melia-azedarach   melia-spp   melting-acceleration   memory   mercurialis-perennis   mercury   merit-dem   mersenne-twister   mesoamerica   mesophilous   mesophytic-species   mespilus-germanica   messerschmidia-argentea   meta-analysis   metacommunities   metadata   metadata-mining   metaecosystems   metaknowledge   metal-pollution   metamaterial   metapopulations   metaprogramming   metasequoia-glyptostroboides   meteorology   methane   methods   metopium-toxiferum   metrology   metrosideros-polymorpha   mexico   mic   micology   microalgae   microclimate   micropterus-salmoides   microsatellite   microsite   microsoft-academic-search   mid-holocene   middle-east   migration   migration-history   migration-pattern   migration-rate   milicia-excelsa   millennium-ecosystem-assessment   milliferous-plant   milvus-migrans   milvus-milvus   min-max   min-temperature   mineralization   minimal-predicted-area  


List of indexed keywords within the transdisciplinary set of domains which relate to the Integrated Natural Resources Modelling and Management (INRMM). In particular, the list of keywords maps the semantic tags in the INRMM Meta-information Database (INRMM-MiD). [\n] The INRMM-MiD records providing this list are accessible by the special tag: inrmm-list-of-tags. ...


Corrigendum to "Modelling soil erosion at European scale: towards harmonization and reproducibility" published in Nat. Hazards Earth Syst. Sci.,15, 225-245, 2015

Natural Hazards and Earth System Science, Vol. 15, No. 2. (16 February 2015), pp. 291-291,


[Excerpt] Two editorial mistakes were found in the article. Both refer to Eq. (2), p. 231 (whose correct version was published in the discussion paper, p. 2652). [\n] The first mistake is related to the operator Ω, which was wrongly rendered with a summation operator (Σ). The editorial notation mistake is also evident by considering the semantics of the RDS (relative distance similarity) statistics. As explained in de Rigo et al. (2013) and Bosco et al. (2013), RDS is defined in [0, 1]. Therefore, a summation operator whose arguments are quantities ...


A computational framework for generalized moving windows and its application to landscape pattern analysis

International Journal of Applied Earth Observation and Geoinformation, Vol. 44 (February 2016), pp. 205-216,


[Highlights] [::] Moving window analysis is a prominent means of analyzing the spatial variability of landscape patterns at multiple scales. [::] A new computational framework is presented that overcomes technical and computational barriers to the use and implementation of moving windows based landscape pattern analysis of raster maps. [::] For a small window of 41 × 41 pixels, computation time was reduced by a factor 600 compared to the most commonly used software. These gains will be greater for larger windows. [::] The framework facilitates ...


Read before you cite!

Complex Systems, Vol. 14, No. 3. (2003), pp. 269-274


We report a method of estimating what percentage of people who cited a paper had actually read it. The method is based on a stochastic modeling of the citation process that explains empirical studies of misprint distributions in citations (which we show follow a Zipf law). We estimate that only about 20% of citers read the original. ...


Nonlinear Component Analysis as a Kernel Eigenvalue Problem

Neural Computation, Vol. 10, No. 5. (1 July 1998), pp. 1299-1319,


A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map - for instance, the space of all possible five-pixel products in 16 x 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition. A new method for performing a nonlinear form ...


Nonlinear principal component analysis: neural network models and applications



Nonlinear principal component analysis (NLPCA) as a nonlinear generalisation of standard principal component analysis (PCA) means to generalise the principal components from straight lines to curves. This chapter aims to provide an extensive description of the autoassociative neural network approach for NLPCA. Several network architectures will be discussed including the hierarchical, the circular, and the inverse model with special emphasis to missing data. Results are shown from applications in the field of molecular biology. This includes metabolite data analysis of a ...


Nonlinear PCA: a new hierarchical approach

In Proceedings of the 10th European Symposium on Artificial Neural Networks (ESANN) (2002), pp. 439-444

Measuring and testing dependence by correlation of distances

The Annals of Statistics, Vol. 35, No. 6. (28 December 2007), pp. 2769-2794,


Distance correlation is a new measure of dependence between random vectors. Distance covariance and distance correlation are analogous to product-moment covariance and correlation, but unlike the classical definition of correlation, distance correlation is zero only if the random vectors are independent. The empirical distance dependence measures are based on certain Euclidean distances between sample elements rather than sample moments, yet have a compact representation analogous to the classical covariance and correlation. Asymptotic properties and applications in testing independence are discussed. Implementation of the test and Monte Carlo results are ...
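The empirical quantities described here can be sketched from their definition: double-center the pairwise distance matrices, then combine them as for ordinary covariance (one-dimensional samples for brevity; an illustrative sketch, not the authors' implementation):

```python
# Double-centered pairwise distance matrix:
# A_jk = a_jk - row mean - column mean + grand mean.
def _centered(values):
    n = len(values)
    d = [[abs(a - b) for b in values] for a in values]
    row = [sum(r) / n for r in d]
    grand = sum(row) / n
    return [[d[i][j] - row[i] - row[j] + grand for j in range(n)]
            for i in range(n)]

# Empirical distance correlation of two one-dimensional samples.
def distance_correlation(x, y):
    n = len(x)
    ax, by = _centered(x), _centered(y)
    dcov2 = sum(ax[i][j] * by[i][j] for i in range(n) for j in range(n)) / n**2
    dvarx = sum(v * v for r in ax for v in r) / n**2
    dvary = sum(v * v for r in by for v in r) / n**2
    return 0.0 if dvarx * dvary == 0 else (dcov2 / (dvarx * dvary) ** 0.5) ** 0.5

print(distance_correlation([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # ~1 for a linear relation
```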


Brownian distance covariance

The Annals of Applied Statistics, Vol. 3, No. 4. (6 Oct 2010), pp. 1236-1265,


Distance correlation is a new class of multivariate dependence coefficients applicable to random vectors of arbitrary and not necessarily equal dimension. Distance covariance and distance correlation are analogous to product-moment covariance and correlation, but generalize and extend these classical bivariate measures of dependence. Distance correlation characterizes independence: it is zero if and only if the random vectors are independent. The notion of covariance with respect to a stochastic process is introduced, and it is shown that population distance covariance coincides with the covariance with respect to Brownian motion; thus, ...


Rejoinder: Brownian distance covariance

The Annals of Applied Statistics, Vol. 3, No. 4. (5 Oct 2010), pp. 1303-1308,


Rejoinder to "Brownian distance covariance" by Gábor J. Székely and Maria L. Rizzo [arXiv:1010.0297] ...


Emergence of patterns in random processes

Physical Review E, Vol. 86, No. 2. (August 2012), 026103,


Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data ...
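The classical observation the abstract starts from is easy to check by simulation: in an i.i.d. continuous series an interior point is a local peak with probability 1/3, so peaks are on average three events apart (a sketch with a fixed seed):

```python
import random

# Mean spacing between local peaks of an i.i.d. uniform series;
# for continuous i.i.d. data this approaches 3 as n grows.
def mean_peak_spacing(n, seed=0):
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]
    peaks = [i for i in range(1, n - 1) if x[i - 1] < x[i] > x[i + 1]]
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(gaps) / len(gaps)

print(mean_peak_spacing(100_000))  # close to 3
```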


Twenty questions for Donald Knuth

INFORMIT, Vol. 2014 (2014), 2213858


To celebrate the publication of the eBooks of The Art of Computer Programming (TAOCP), we asked several computer scientists, contemporaries, colleagues, and well-wishers to pose one question each to author Donald E. Knuth. Here are his answers. ...


Clustering by fast search and find of density peaks

Science, Vol. 344, No. 6191. (26 June 2014), pp. 1492-1496,


[Abstract] Cluster analysis is aimed at classifying elements into categories on the basis of their similarity. Its applications range from astronomy to bioinformatics, bibliometrics, and pattern recognition. We propose an approach based on the idea that cluster centers are characterized by a higher density than their neighbors and by a relatively large distance from points with higher densities. This idea forms the basis of a clustering procedure in which the number of clusters arises intuitively, outliers are automatically spotted and excluded ...
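The two quantities the procedure is built on can be sketched as follows: a local density rho (neighbours within a cutoff d_c) and a distance delta to the nearest higher-density point, with cluster centres where both are large (one-dimensional toy data; an illustration, not the authors' code):

```python
# For each point: rho = count of neighbours within d_c,
# delta = distance to the nearest point of higher density.
def density_peaks(points, d_c):
    n = len(points)
    rho = [sum(1 for j in range(n) if j != i and abs(points[i] - points[j]) < d_c)
           for i in range(n)]
    delta = []
    for i in range(n):
        higher = [abs(points[i] - points[j]) for j in range(n) if rho[j] > rho[i]]
        # The densest points get the largest distance by convention.
        delta.append(min(higher) if higher
                     else max(abs(points[i] - points[j]) for j in range(n)))
    return rho, delta

rho, delta = density_peaks([0.0, 0.1, 0.2, 5.0, 5.1, 5.2], 0.15)
print(rho)  # the middle point of each cluster is densest: [1, 2, 1, 1, 2, 1]
```

Ranking points by rho * delta then singles out one centre per cluster, which is the decision-graph idea of the paper.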


IEEE Standard for Floating-Point Arithmetic

IEEE Std 754-2008 (August 2008), pp. 1-70,
Keywords: ieee   mathematics   multiauthor   programming   standard  


This standard specifies interchange and arithmetic formats and methods for binary and decimal floating-point arithmetic in computer programming environments. This standard specifies exception conditions and their default handling. An implementation of a floating-point system conforming to this standard may be realized entirely in software, entirely in hardware, or in any combination of software and hardware. For operations specified in the normative part of this standard, numerical results and exceptions are uniquely determined by the values of the input data, sequence of ...
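A small illustration of why correctly rounded binary arithmetic, as specified by the standard, matters in practice: decimal fractions such as 0.1 have no exact binary64 representation, so each operation rounds:

```python
# 0.1, 0.2 and 0.3 are all rounded to the nearest binary64 value,
# so the sum differs from the literal 0.3 by a tiny, well-defined amount.
a = 0.1 + 0.2
print(a == 0.3, abs(a - 0.3))  # False, and a difference of about 5.6e-17
```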


An American National Standard- IEEE Standard for Binary Floating-Point Arithmetic

ANSI/IEEE Std 754-1985 (1985),
by John E. May, John P. Riganati, Sava I. Sherr, James H. Beall, Fletcher J. Buckley, Rene Castenschiold, Edward Chelotti, Edward J. Cohen, Paul G. Cummings, Donald C. Fleckenstein, Jay Forster, Daniel L. Goldberg, Kenneth D. Hendrix, Irvin N. Howell, Jack Kinn, Joseph L. Koepfinger, Irving Kolodny, R. F. Lawrence, Lawrence V. McCall, Donald T. Michael, Frank L. Rose, Clifford O. Swanson, J. Richard Weger, W. B. Wilkens, Charles J. Wylie, Andrew Allison, William Ames, Mike Arya, Janis Baron, Steve Baumel, Dileep Bhandarkar, Joel Boney, E. H. Bristol, Werner Buchholz, Jim Bunch, Ed Burdick, Gary R. Burke, Paul Clemente, W. J. Cody, Jerome T. Coonen, Jim Crapuchettes, Itzhak Davidesko, Wayne Davison, R. H. Delp, James Demmel, Donn Denman, Alvin Despain, Augustin A. Dubrulle, Tom Eggers, Philip J. Faillace, Richard Fateman, David Feign, Don Feinberg, Smart Feldman, Eugene Fisher, Paul F. Flanagan, Gordon Force, Lloyd Fosdick, Robert Fraley, Howard Fullmer, Daniel D. Gajski, David M. Gay, C. W. Gear, Martin Graham, David Gustavson, Guy K. Haas, Kenton Hanson, Chuck Hastings, David Hough, John E. Howe, Thomas E. Hull, Suren Irukulla, Richard, Paul S. Jensen, W. Kahan, Howard Kaikow, Richard Karpinski, Virginia Klema, Les Kohn, Dan Kuyper, M. Dundee Maples, Roy Martin, William H. McAllister, Colin McMaster, Dean Miller, Webb Miller, John C. Nash, Dan O'Dowd, Cash Olsen, A. Padegs, John F. Palmer, Beresford Parlett, Dave Patterson, Mary H. Payne, Tom Pittman, Lew Randall, Robert Reid, Christian Reinsch, Frederic N. Ris, Stan Schmidt, Van Shahan, Robert L. Smith, Roger Stafford, G. W. Stewart, Robert Stewart, Harold S. Stone, W. D. Strecker, Robert Swarz, George Taylor, James W. Thomas, Dar-Sun Tsien, Greg Walker, John S. Walther, Shlomo Waser, P. C. Waterman, Charles White
Keywords: ansi   ieee   mathematics   multiauthor   programming   standard  


This standard is a product of the Floating-Point Working Group of the Microprocessor Standards Subcommittee of the Standards Committee of the IEEE Computer Society. This work was sponsored by the Technical Committee on Microprocessors and Minicomputers. Draft 8.0 of this standard was published to solicit public comments. Implementation techniques can be found in An Implementation Guide to a Proposed Standard for Floating-Point Arithmetic by J.T. Coonen, which was based on a still earlier draft of the proposal. This standard defines a ...


Transposition invariant string matching

Journal of Algorithms, Vol. 56, No. 2. (August 2005), pp. 124-153,


Given strings A and B over an alphabet Σ⊆U, where U is some numerical universe closed under addition and subtraction, and a distance function d(A,B) that gives the score of the best (partial) matching of A and B, the transposition invariant distance is min_{t∈U} d(A+t,B), where A+t=(a1+t)(a2+t)...(am+t). We study the problem of computing the transposition invariant distance for various distance (and similarity) functions d, including Hamming distance, longest common subsequence (LCS), Levenshtein distance, and their versions where the exact matching condition is ...
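For the Hamming-distance case mentioned above, the optimal transposition t is simply the most frequent pairwise difference (a sketch of the problem statement, not of the paper's more general algorithms):

```python
from collections import Counter

# min_t d_H(A + t, B): the best t is the mode of the differences b_i - a_i;
# every position with that difference matches, the rest are mismatches.
def transposition_invariant_hamming(a, b):
    diffs = Counter(bi - ai for ai, bi in zip(a, b))
    t, matches = diffs.most_common(1)[0]
    return t, len(a) - matches

# E.g. two pitch sequences, one almost a transposition of the other:
print(transposition_invariant_hamming([60, 62, 64, 65], [65, 67, 69, 71]))  # (5, 1)
```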


Crowd-sourcing: strength in numbers

Nature, Vol. 506, No. 7489. (26 February 2014), pp. 422-423,


Researchers are finding that online, crowd-sourced collaboration can speed up their work if they choose the right problem. [Excerpt] [...] Yet this open approach has taken root as an ongoing crowd-sourcing project called Polymath. [...] Polymath 8 was a triumph for the collaborative approach, says Tao. If mathematicians had been attacking the problem in the standard way, with what he describes as “a flood of mini-papers”, it might have taken years to get the bound down that far. Polymath has not always ...


Parallel lines

Nature, Vol. 506, No. 7489. (26 February 2014), pp. 407-408,


A collaborative online mathematics project holds lessons for other disciplines. [Excerpt] Crowd-sourcing has reached mathematics, and at first glance it might seem as if this stereotypically solitary discipline is finally catching up with what other sciences have been doing for years. But, as we explore on page 422, the maths project Polymath, which invites participants to pitch in with ideas and results that might help to solve whatever problem the coordinator has set, is in some ways ahead of the curve. Not ...


Fourth class of convex equilateral polyhedron with polyhedral symmetry related to fullerenes and viruses

Proceedings of the National Academy of Sciences, Vol. 111, No. 8. (25 February 2014), pp. 2920-2925,
Keywords: mathematics   networks   topology  


[Significance] The Greeks described two classes of convex equilateral polyhedron with polyhedral symmetry, the Platonic (including the tetrahedron, octahedron, and icosahedron) and the Archimedean (including the truncated icosahedron with its soccer-ball shape). Johannes Kepler discovered a third class, the rhombic polyhedra. Some carbon fullerenes, inorganic cages, icosahedral viruses, protein complexes, and geodesic structures resemble these polyhedra. Here we add a fourth class, “Goldberg polyhedra.” Their small (corner) faces are regular 3-gons, 4-gons, or 5-gons, whereas their planar 6-gonal faces are equilateral ...


Inefficient epidemic spreading in scale-free networks

Physical Review E, Vol. 77, No. 2. (Feb 2008), 026113,


Highly heterogeneous degree distributions yield efficient spreading of simple epidemics through networks, but can be inefficient with more complex epidemiological processes. We study diseases with nonlinear force of infection whose prevalences can abruptly collapse to zero while decreasing the transmission parameters. We find that scale-free networks can be unable to support diseases that, on the contrary, are able to persist at high endemic levels in homogeneous networks with the same average degree. ...


On large scale forest fires propagation models

International Journal of Thermal Sciences, Vol. 47, No. 6. (June 2008), pp. 680-694,


The question of the modeling of forest fires at large scales is addressed. Empirical models are compared and it is shown that Rothermel's model describing the rate of spread of a straight front is included in the envelope model which in turn is included in a Hamilton–Jacobi equation description. This result shows that the preceding models could be included in reaction diffusion systems. Then an anisotropic propagation model with a nonlocal radiative term, obtained by asymptotic expansion of a combustion modeling, ...



In Free Software Directory (April 2011)
edited by Janet Casey


ATLAS (Automatically Tuned Linear Algebra Software) is a system for generating high-performance mathematical libraries. It generates a library that is specifically tuned to your processor and compiler. ATLAS's purpose is to provide portably optimal linear algebra software. In particular, ATLAS provides ANSI C and Fortran 77 interfaces to the BLAS, and a subset of LAPACK. ...



In Free Software Directory (April 2011)
edited by Kelly Hopkins


'MCSim' is a simulation and statistical inference tool for algebraic or differential equation systems. While other programs have been created to the same end, many of them are not optimal for performing computer intensive and sophisticated Monte Carlo analyses. MCSim was created specifically to perform Monte Carlo analyses in an optimized, and easy to maintain environment. ...



In Free Software Directory (October 2011), 8008


Mastrave is a free software library written to perform vectorized scientific computing and to be as compatible as possible with both GNU Octave and Matlab computing frameworks, offering general purpose, portable and freely available features for the scientific community. Mastrave is mostly oriented to ease complex modeling tasks such as those typically needed within environmental models, even when involving irregular and heterogeneous data series. [Semantic array programming]. The Mastrave project attempts to allow a more effective, quick interoperability between GNU Octave and Matlab ...



In Free Software Directory (April 2011)


MedianTracker supports efficient median queries on and dynamic additions to a list of values. It provides both the lower and upper median of all values seen so far. Any __cmp__()-able object can be tracked, in addition to numeric types. add() takes log(n) time for a tracker with n items; lower_median() and upper_median() run in constant time. Since all values must be stored, memory usage is proportional to the number of values added (O(n)). ...
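The complexity bounds quoted above match the classical two-heap construction, sketched here as an illustrative reconstruction (the class name mirrors the package; this is not its actual code):

```python
import heapq

class MedianTracker:
    def __init__(self):
        self._low = []   # max-heap via negated values: the lower half
        self._high = []  # min-heap: the upper half

    def add(self, value):
        # O(log n): push through the low heap, then rebalance so that
        # len(low) is always len(high) or len(high) + 1.
        heapq.heappush(self._low, -value)
        heapq.heappush(self._high, -heapq.heappop(self._low))
        if len(self._high) > len(self._low):
            heapq.heappush(self._low, -heapq.heappop(self._high))

    def lower_median(self):
        # O(1): the top of the lower half.
        return -self._low[0]

    def upper_median(self):
        # O(1): with an odd count both medians coincide.
        return self._high[0] if len(self._high) == len(self._low) else -self._low[0]

t = MedianTracker()
for v in [5, 1, 3, 2]:
    t.add(v)
print(t.lower_median(), t.upper_median())  # 2 and 3
```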



In Free Software Directory (April 2013)


These calculators are real-time multi-model option chain pricers with analytics and interactive controls. optionmatrix is the GTK+ graphical user interface version and optionmatrix_console is the Curses version. Both programs feature: greeks, decimal date to real-date translations, real-date to decimal date translations, real-time time bleeding, configurable option expiration date engines, calendars, strike control systems, tickers and over 135 option models. optionmatrix also supports: spreads, bonds, term structures, cash flow editing, source code viewing and text exporting. ...



In Free Software Directory (April 2011)


PSPP is a program for statistical analysis of sampled data. It is a Free replacement for the proprietary program SPSS. ...


Emergent relation between surface vapor conductance and relative humidity profiles yields evaporation rates from weather data

Proceedings of the National Academy of Sciences, Vol. 110, No. 16. (16 April 2013), pp. 6287-6291,


The ability to predict terrestrial evapotranspiration (E) is limited by the complexity of rate-limiting pathways as water moves through the soil, vegetation (roots, xylem, stomata), canopy air space, and the atmospheric boundary layer. The impossibility of specifying the numerous parameters required to model this process in full spatial detail has necessitated spatially upscaled models that depend on effective parameters such as the surface vapor conductance (Csurf). Csurf accounts for the biophysical and hydrological effects on diffusion through the soil and vegetation ...


Variations of Non-Additive Measures

Acta Polytechnica Hungarica, Vol. 2, No. 1. (2005)
edited by Imre J. Rudas


General non-additive measures are investigated with the help of some related monotone measures (some types of variations and submeasures), which have some important additional properties. ...


Core foundations of abstract geometry

Proceedings of the National Academy of Sciences, Vol. 110, No. 35. (27 August 2013), pp. 14191-14195,


Human adults from diverse cultures share intuitions about the points, lines, and figures of Euclidean geometry. Do children develop these intuitions by drawing on phylogenetically ancient and developmentally precocious geometric representations that guide their navigation and their analysis of object shape? In what way might these early-arising representations support later-developing Euclidean intuitions? To approach these questions, we investigated the relations among young children’s use of geometry in tasks assessing: navigation; visual form analysis; and the interpretation of symbolic, purely geometric maps. ...

This page of the database may be cited as:
Integrated Natural Resources Modelling and Management - Meta-information Database.



Integrated Natural Resources Modelling and Management Meta-information Database (INRMM-MiD).
This database integrates a dedicated meta-information database in CiteULike (the CiteULike INRMM Group) with the meta-information available in Google Scholar, CrossRef and DataCite. The Altmetric database with Article-Level Metrics is also harvested. Part of the provided semantic (machine-readable) content is also made human-readable thanks to the DCMI Dublin Core viewer. Digital preservation of the meta-information indexed within the INRMM-MiD publication records is implemented thanks to the Internet Archive.
Full-text and abstracts of the publications indexed by the INRMM meta-information database are copyrighted by the respective publishers/authors. They are subject to all applicable copyright protection. The conditions of use of each indexed publication are defined by its copyright owner. Please be aware that the indexed meta-information relies entirely on voluntary work and constitutes an incomplete and heterogeneous work in progress.
INRMM-MiD was experimentally established by the Maieutike Research Initiative in 2008 and then improved with the help of several volunteers (with a major technical upgrade in 2011). This new integrated interface has been operational since 2014.