
Abstract
[Excerpt: Introduction] Hierarchical modeling is a widely used approach to building complex models by specifying a series of simpler conditional distributions. It naturally lends itself to Bayesian inference, especially using modern tools for Bayesian computation. In this chapter we first present essential concepts of hierarchical modeling, and then suggest its generality by presenting a series of widely used specific models. [...] [\n] [...] [Summary] In this chapter we have introduced hierarchical modeling as a very general approach to specifying complex models through a ...


Abstract
This is the first of two papers that use off-training set (OTS) error to investigate the assumption-free relationship between learning algorithms. This first paper discusses the senses in which there are no a priori distinctions between learning algorithms. (The second paper discusses the senses in which there are such distinctions.) In this first paper it is shown, loosely speaking, that for any two algorithms A and B, there are “as many” targets (or priors over targets) for which A has lower ...


Abstract
On 14 July, Maryam Mirzakhani, a luminary in pure mathematics, died of cancer at the age of 40. Her achievements had been most recently honored in 2014 by the Fields Medal, the most prestigious award in mathematics. [Excerpt] [...] There are many possible uniformly curved shapes into which the surface can be bent. These shapes are called hyperbolic metrics and exhibit the non-Euclidean geometry discovered in the 1800s after 2000 years of attempts to prove its nonexistence. The plethora of all possible ...


In NIST/SEMATECH e-Handbook of Statistical Methods (2012), 7.2.6.2
Abstract
[Excerpt: Definitions of order statistics and ranks] For a series of measurements Y1, …, YN, denote the data ordered in increasing order of magnitude by Y〈1〉, …, Y〈N〉. These ordered data are called order statistics. If Y〈j〉 is the order statistic that corresponds to the measurement Yᵢ, then the rank for Yᵢ is j; i.e., [::] Y〈j〉 ∼ Yᵢ, rᵢ=j. [Definition of percentiles] Order statistics provide a way of estimating proportions of the data that should fall above and below a ...
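The definitions above translate directly into code. The following sketch (illustrative, not from the e-Handbook; it assumes no tied values, which would need a tie-breaking rule) computes the order statistics and the ranks of a small sample:

```python
# Direct transcription of the definitions above (illustrative sketch;
# assumes no tied values -- ties would need a tie-breaking rule).

def order_statistics(y):
    """The data sorted in increasing order: Y(1), ..., Y(N)."""
    return sorted(y)

def ranks(y):
    """The rank r_i of each measurement Y_i, i.e. its 1-based
    position among the order statistics."""
    ordered = order_statistics(y)
    return [ordered.index(v) + 1 for v in y]

y = [3.2, 1.5, 4.8, 2.0]
print(order_statistics(y))  # [1.5, 2.0, 3.2, 4.8]
print(ranks(y))             # [3, 1, 4, 2]
```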


Abstract
There are a large number of different definitions used for sample quantiles in statistical computer packages. Often within the same package one definition will be used to compute a quantile explicitly, while other definitions may be used when producing a boxplot, a probability plot, or a Q-Q plot. We compare the most commonly implemented sample quantile definitions by writing them in a common notation and investigating their motivation and some of their properties. We argue that there is a need to ...
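The paper's point is easy to demonstrate (illustrative, not from the paper): the same data and probability yield different sample quantiles under different definitions. NumPy's `method` argument (available in recent NumPy versions) exposes several of the definitions compared; its default, "linear", is Hyndman and Fan's type 7:

```python
import numpy as np

# Illustration (not from the paper): the same data and probability give
# three different sample quantiles under three of the compared
# definitions. NumPy's default method, "linear", is Hyndman & Fan type 7.
x = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
for method in ["inverted_cdf", "linear", "median_unbiased"]:
    print(method, float(np.quantile(x, 0.3, method=method)))
```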


Abstract
We examine the detrended fluctuation analysis (DFA), which is a well-established method for the detection of long-range correlations in time series. We show that deviations from scaling which appear at small time scales become stronger in higher orders of DFA, and suggest a modified DFA method to remove them. The improvement is necessary especially for short records that are affected by nonstationarities. Furthermore, we describe how crossovers in the correlation behavior can be detected reliably and determined quantitatively and show how ...


Abstract
[Significance] The question of whether a seasonal climatic trend (e.g., the increase of spring temperatures in Antarctica in the last decades) is of anthropogenic or natural origin is of great importance, because seasonal climatic trends may considerably affect ecological systems, agricultural yields, and human societies. Previous studies assumed that the seasonal records can be treated as independent and are characterized by short-term memory only. Here we show that both assumptions, which may lead to a considerable overestimation of the trend significance, do not ...


Abstract
The Talmi and Gilat variational approach to the interpolation problem in arbitrary dimension is presented together with the corresponding physical model. The connection of this approach to some known spline methods is demonstrated and new interpolation functions are derived for the one-, two- and three-dimensional cases. They are designed to be flexible through the use of meaningful parameters and to give good approximations of both the function itself and its derivatives. ...


Abstract
An algorithm and the corresponding computer program for the solution of the scattered data interpolation problem are described. Given points (xk, yk, fk), k = 1,…,N, a locally defined function F(x, y) which has the property F(xk, yk) = fk, k = 1,…,N is constructed. The algorithm is based on a weighted sum of locally defined thin plate splines, and yields an interpolation function which is differentiable. The program is available from the author. ...


Abstract
An extension of the kernel method of density estimation from continuous to multivariate binary spaces is described. Its simple nonparametric nature together with its consistency properties make it an attractive tool in discrimination problems, with some advantages over already proposed parametric counterparts. The method is illustrated by an application to a particular medical diagnostic problem. Simple extensions of the method to categorical data and to data of mixed binary and continuous form are indicated. ...


Abstract
The fate of the dimensions of dimensioned quantities that are inserted into the argument of transcendental functions such as logarithms, exponentials, and trigonometric and hyperbolic functions is discussed. Emphasis is placed on common misconceptions that are not often systematically examined in undergraduate courses in the physical sciences. The argument of dimensional inhomogeneity of the terms of a Taylor expansion of a transcendental function, presented on some non-peer-reviewed popular Internet sites, is shown to be false. ...


Abstract
A drawback in the use of the Analytic Hierarchy Process (AHP) is the effort required to complete all pairwise comparisons in large hierarchies. The Incomplete Pairwise Comparison (IPC) technique developed by Harker [1,2] aims at reducing this effort by ordering the questions in decreasing informational value and by stopping the process when the added value of further questions decreases below a certain level. This paper proposes further opportunities for effort reduction through a globally effective elicitation process. A simple example demonstrates impressive savings in ...


Abstract
For the purpose of a product design or a project in general, weighting a set of comparable criteria has been proven to be of utmost importance (e.g., weighting product functions in value analysis (VA), and allocating a budget in a Design-to-Cost project). Moreover, the weighting problem is related to basic properties in the field of Multi-Criteria Decision Analysis (MCDA) through the notions of ordinal transitivity and rationality in the designers' mind. How should designers or project agents decide in ...


(2008)
Abstract
[Excerpt: Introduction] ePiX, a collection of batch utilities, creates mathematically accurate figures, plots, and animations containing LaTeX typography. The input syntax is easy to learn, and the user interface resembles that of LaTeX itself: You prepare a scene description in a text editor, then “compile” the input file into a picture. LaTeX- and web-compatible output types include a LaTeX picture-like environment written with PSTricks, tikz, or eepic macros; vector images (eps, ps, and pdf); and bitmapped images and movies (png, mng, and gif). [\n] ePiX’s strengths include: [::] Quality of ...


Abstract
Distance covariance and distance correlation are scalar coefficients that characterize independence of random vectors in arbitrary dimension. Properties, extensions and applications of distance correlation have been discussed in the recent literature, but the problem of defining the partial distance correlation has remained an open question of considerable interest. The problem of partial distance correlation is more complex than partial correlation partly because the squared distance covariance is not an inner product in the usual linear space. For the definition of partial ...


Abstract
Energy distance is a metric that measures the distance between the distributions of random vectors. Energy distance is zero if and only if the distributions are identical; thus it characterizes equality of distributions and provides a theoretical foundation for statistical inference and analysis. Energy statistics are functions of distances between observations in metric spaces. As a statistic, energy distance can be applied to measure the difference between a sample and a hypothesized distribution or the difference between two or more samples ...
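A minimal sketch (illustrative, not the authors' code) of the two-sample energy distance, D²(X, Y) = 2·E|X − Y| − E|X − X′| − E|Y − Y′|, estimated by averaging pairwise Euclidean distances between observations:

```python
import numpy as np

# Minimal sketch (not the authors' code) of the two-sample energy distance
#   D^2(X, Y) = 2 E|X - Y| - E|X - X'| - E|Y - Y'|,
# estimated by averaging pairwise Euclidean distances between observations.

def pairwise_mean(a, b):
    """Mean Euclidean distance over all pairs of rows of a and b."""
    return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).mean()

def energy_distance(x, y):
    return 2 * pairwise_mean(x, y) - pairwise_mean(x, x) - pairwise_mean(y, y)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(0.0, 1.0, size=(500, 2))  # same distribution as x
z = rng.normal(3.0, 1.0, size=(500, 2))  # shifted distribution
print(energy_distance(x, y))  # close to zero
print(energy_distance(x, z))  # clearly positive
```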


In Free Software Directory (July 2009)
Abstract
An optimal interpolation toolbox for Octave. This package provides functions to perform n-dimensional optimal interpolation of arbitrarily distributed data points ...


(February 2014)
Abstract
List of indexed keywords within the transdisciplinary set of domains which relate to the Integrated Natural Resources Modelling and Management (INRMM). In particular, the list of keywords maps the semantic tags in the INRMM Meta-information Database (INRMM-MiD). [\n] The INRMM-MiD records providing this list are accessible by the special tag: inrmmlistoftags ( http://mfkp.org/INRMM/tag/inrmmlistoftags ). ...


Abstract
[Excerpt] Two editorial mistakes were found in the article. Both refer to Eq. (2), p. 231 (whose correct version was published in the discussion paper, p. 2652). [\n] The first mistake is related to the operator Ω, which was wrongly rendered with a summation operator (Σ). The editorial notation mistake is also evident by considering the semantics of the RDS (relative distance similarity) statistics. As explained in de Rigo et al. (2013) and Bosco et al. (2013), RDS is defined in [0, 1]. Therefore, a summation operator whose arguments are quantities ...


Abstract
[Highlights] [::] Moving window analysis is a prominent means of analyzing the spatial variability of landscape patterns at multiple scales. [::] A new computational framework is presented that overcomes technical and computational barriers to the use and implementation of moving-window-based landscape pattern analysis of raster maps. [::] For a small window of 41 × 41 pixels, computation time was reduced by a factor of 600 compared to the most commonly used software. These gains will be greater for larger windows. [::] The framework facilitates ...


Complex Systems, Vol. 14, No. 3. (2003), pp. 269-274
Abstract
We report a method of estimating what percentage of people who cited a paper had actually read it. The method is based on a stochastic modeling of the citation process that explains empirical studies of misprint distributions in citations (which we show follow a Zipf law). Our estimate is that only about 20% of citers read the original. ...


Abstract
A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map - for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition. ...


Abstract
Nonlinear principal component analysis (NLPCA), as a nonlinear generalisation of standard principal component analysis (PCA), means to generalise the principal components from straight lines to curves. This chapter aims to provide an extensive description of the auto-associative neural network approach for NLPCA. Several network architectures will be discussed, including the hierarchical, the circular, and the inverse model, with special emphasis on missing data. Results are shown from applications in the field of molecular biology. This includes metabolite data analysis of a ...


In Proceedings of the 10th European Symposium on Artificial Neural Networks (ESANN) (2002), pp. 439-444


Abstract
Distance correlation is a new measure of dependence between random vectors. Distance covariance and distance correlation are analogous to product-moment covariance and correlation, but unlike the classical definition of correlation, distance correlation is zero only if the random vectors are independent. The empirical distance dependence measures are based on certain Euclidean distances between sample elements rather than sample moments, yet have a compact representation analogous to the classical covariance and correlation. Asymptotic properties and applications in testing independence are discussed. Implementation of the test and Monte Carlo results are ...
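A minimal sketch of the empirical statistic (illustrative; it follows the standard definition via double-centered pairwise distance matrices, with function names of my own choosing). Note how a dependent-but-uncorrelated pair is detected while an independent pair is not:

```python
import numpy as np

# Minimal sketch of the empirical statistic (function names are my own):
# double-center each pairwise distance matrix, then combine the centered
# matrices the way ordinary covariances are combined.

def centered_distances(x):
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return d - d.mean(axis=0) - d.mean(axis=1, keepdims=True) + d.mean()

def distance_correlation(x, y):
    a, b = centered_distances(x), centered_distances(y)
    dcov2_xy = (a * b).mean()
    dcov2_xx = (a * a).mean()
    dcov2_yy = (b * b).mean()
    denom = np.sqrt(dcov2_xx * dcov2_yy)
    return np.sqrt(dcov2_xy / denom) if denom > 0 else 0.0

rng = np.random.default_rng(1)
x = rng.normal(size=(300, 1))
y_dep = x ** 2 + 0.1 * rng.normal(size=(300, 1))  # dependent, yet uncorrelated
y_ind = rng.normal(size=(300, 1))                 # independent of x
print(distance_correlation(x, y_dep))  # clearly above zero
print(distance_correlation(x, y_ind))  # near zero
```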


Abstract
Distance correlation is a new class of multivariate dependence coefficients applicable to random vectors of arbitrary and not necessarily equal dimension. Distance covariance and distance correlation are analogous to product-moment covariance and correlation, but generalize and extend these classical bivariate measures of dependence. Distance correlation characterizes independence: it is zero if and only if the random vectors are independent. The notion of covariance with respect to a stochastic process is introduced, and it is shown that population distance covariance coincides with the covariance with respect to Brownian motion; thus, ...


Abstract
Rejoinder to "Brownian distance covariance" by Gábor J. Székely and Maria L. Rizzo [arXiv:1010.0297] ...


Abstract
Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data ...
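The "three events per sequence" observation is easy to check by simulation (an illustrative sketch, not the paper's derivation): for an i.i.d. sequence, the probability that a point is a local maximum is 1/3, so successive peaks are on average three steps apart:

```python
import random

# Simulation sketch (not the paper's derivation). For an i.i.d. sequence,
# P(x[i-1] < x[i] > x[i+1]) = 1/3, so local maxima occur at density 1/3
# and successive peaks are on average three steps apart -- the "three
# events per sequence" pattern for peak-to-peak sequences.
random.seed(42)
n = 200_000
x = [random.random() for _ in range(n)]
peaks = [i for i in range(1, n - 1) if x[i - 1] < x[i] > x[i + 1]]
spacings = [b - a for a, b in zip(peaks, peaks[1:])]
mean_spacing = sum(spacings) / len(spacings)
print(round(mean_spacing, 2))  # close to 3
```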


INFORMIT, Vol. 2014 (2014), 2213858
Abstract
To celebrate the publication of the eBooks of The Art of Computer Programming (TAOCP), we asked several computer scientists, contemporaries, colleagues, and well-wishers to pose one question each to author Donald E. Knuth. Here are his answers. ...


Abstract
[Abstract] Cluster analysis is aimed at classifying elements into categories on the basis of their similarity. Its applications range from astronomy to bioinformatics, bibliometrics, and pattern recognition. We propose an approach based on the idea that cluster centers are characterized by a higher density than their neighbors and by a relatively large distance from points with higher densities. This idea forms the basis of a clustering procedure in which the number of clusters arises intuitively, outliers are automatically spotted and excluded ...
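The two quantities the procedure is built on can be sketched as follows (a hypothetical helper, not the authors' code): a local density rho for each point and delta, the distance to the nearest point of higher density; cluster centers stand out as points where both are large:

```python
import numpy as np

# Sketch of the two quantities the procedure relies on (hypothetical
# helper, not the authors' code): for each point, a local density rho
# (number of neighbours within a cutoff d_c) and delta, the distance to
# the nearest point of higher density (points are processed in
# decreasing-density order so ties are broken consistently).

def rho_delta(points, d_c):
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    rho = (d < d_c).sum(axis=1) - 1          # exclude the point itself
    order = np.argsort(-rho)                 # decreasing density
    delta = np.empty(len(points))
    delta[order[0]] = d[order[0]].max()      # densest point: max distance
    for k in range(1, len(order)):
        i = order[k]
        delta[i] = d[i, order[:k]].min()     # nearest denser point
    return rho, delta

rng = np.random.default_rng(3)
pts = np.vstack([rng.normal(0.0, 0.5, size=(100, 2)),
                 rng.normal(10.0, 0.5, size=(100, 2))])
rho, delta = rho_delta(pts, d_c=1.0)
top2 = np.argsort(rho * delta)[-2:]          # candidate cluster centres
print(pts[top2].round(1))                    # one point near each blob
```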


by Dan Zuras, Mike Cowlishaw, Alex Aiken, Matthew Applegate, David Bailey, Steve Bass, Dileep Bhandarkar, Mahesh Bhat, David Bindel, Sylvie Boldo, Stephen Canon, Steven R. Carlough, Marius Cornea, Mike Cowlishaw, John H. Crawford, Joseph D. Darcy, Debjit Das Sarma, Marc Daumas, Bob Davis, Mark Davis, Dick Delp, Jim Demmel, Mark A. Erle, Hossam A. H. Fahmy, J. P. Fasano, Richard Fateman, Eric Feng, Warren E. Ferguson, Alex Fit-Florea, Laurent Fournier, Chip Freitag, Ivan Godard, Roger A. Golliver, David Gustafson, Michel Hack, John R. Harrison, John Hauser, Yozo Hida, Chris N. Hinds, Graydon Hoare, David G. Hough, Jerry Huck, Jim Hull, Michael Ingrassia, David V. James, Rick James, William Kahan, John Kapernick, Richard Karpinski, Jeff Kidder, Plamen Koev, Ren-Cang Li, Zhishun A. Liu, Raymond Mak, Peter Markstein, David Matula, Guillaume Melquiond, Nobuyoshi Mori, Ricardo Morin, Ned Nedialkov, Craig Nelson, Stuart Oberman, Jon Okada, Ian Ollmann, Michael Parks, Tom Pittman, Eric Postpischil, Jason Riedy, Eric M. Schwarz, David Scott, Don Senzig, Ilya Sharapov, Jim Shearer, Michael Siu, Ron Smith, Chuck Stevens, Peter Tang, Pamela J. Taylor, James W. Thomas, Brandon Thompson, Wendy Thrash, Neil Toda, Son D. Trong, Leonard Tsai, Charles Tsen, Fred Tydeman, Liang K. Wang, Scott Westbrook, Steve Winkler, Anthony Wood, Umit Yalcinalp, Fred Zemke, Paul Zimmermann
Abstract
This standard specifies interchange and arithmetic formats and methods for binary and decimal floating-point arithmetic in computer programming environments. This standard specifies exception conditions and their default handling. An implementation of a floating-point system conforming to this standard may be realized entirely in software, entirely in hardware, or in any combination of software and hardware. For operations specified in the normative part of this standard, numerical results and exceptions are uniquely determined by the values of the input data, sequence of ...


by John E. May, John P. Riganati, Sava I. Sherr, James H. Beall, Fletcher J. Buckley, Rene Castenschiold, Edward Chelotti, Edward J. Cohen, Paul G. Cummings, Donald C. Fleckenstein, Jay Forster, Daniel L. Goldberg, Kenneth D. Hendrix, Irvin N. Howell, Jack Kinn, Joseph L. Koepfinger, Irving Kolodny, R. F. Lawrence, Lawrence V. McCall, Donald T. Michael, Frank L. Rose, Clifford O. Swanson, J. Richard Weger, W. B. Wilkens, Charles J. Wylie, Andrew Allison, William Ames, Mike Arya, Janis Baron, Steve Baumel, Dileep Bhandarkar, Joel Boney, E. H. Bristol, Werner Buchholz, Jim Bunch, Ed Burdick, Gary R. Burke, Paul Clemente, W. J. Cody, Jerome T. Coonen, Jim Crapuchettes, Itzhak Davidesko, Wayne Davison, R. H. Delp, James Demmel, Donn Denman, Alvin Despain, Augustin A. Dubrulle, Tom Eggers, Philip J. Faillace, Richard Fateman, David Feign, Don Feinberg, Stuart Feldman, Eugene Fisher, Paul F. Flanagan, Gordon Force, Lloyd Fosdick, Robert Fraley, Howard Fullmer, Daniel D. Gajski, David M. Gay, C. W. Gear, Martin Graham, David Gustavson, Guy K. Haas, Kenton Hanson, Chuck Hastings, David Hough, John E. Howe, Thomas E. Hull, Suren Irukulla, Richard, Paul S. Jensen, W. Kahan, Howard Kaikow, Richard Karpinski, Virginia Klema, Les Kohn, Dan Kuyper, M. Dundee Maples, Roy Martin, William H. McAllister, Colin McMaster, Dean Miller, Webb Miller, John C. Nash, Dan O'Dowd, Cash Olsen, A. Padegs, John F. Palmer, Beresford Parlett, Dave Patterson, Mary H. Payne, Tom Pittman, Lew Randall, Robert Reid, Christian Reinsch, Frederic N. Ris, Stan Schmidt, Van Shahan, Robert L. Smith, Roger Stafford, G. W. Stewart, Robert Stewart, Harold S. Stone, W. D. Strecker, Robert Swarz, George Taylor, James W. Thomas, Dar-Sun Tsien, Greg Walker, John S. Walther, Shlomo Waser, P. C. Waterman, Charles White
Abstract
This standard is a product of the Floating-Point Working Group of the Microprocessor Standards Subcommittee of the Standards Committee of the IEEE Computer Society. This work was sponsored by the Technical Committee on Microprocessors and Minicomputers. Draft 8.0 of this standard was published to solicit public comments. Implementation techniques can be found in An Implementation Guide to a Proposed Standard for Floating-Point Arithmetic by J.T. Coonen, which was based on a still earlier draft of the proposal. This standard defines a ...


Abstract
Given strings A = a1a2...am and B = b1b2...bn over an alphabet Σ ⊆ U, where U is some numerical universe closed under addition and subtraction, and a distance function d(A,B) that gives the score of the best (partial) matching of A and B, the transposition invariant distance is min_{t∈U} d(A+t, B), where A+t = (a1+t)(a2+t)...(am+t). We study the problem of computing the transposition invariant distance for various distance (and similarity) functions d, including Hamming distance, longest common subsequence (LCS), Levenshtein distance, and their versions where the exact matching condition is ...
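For the Hamming-distance case the definition admits a simple brute-force sketch (illustrative only; the paper's contribution is far more efficient algorithms): an optimal transposition t must make at least one aligned position match, so it suffices to try t = bᵢ − aᵢ for every position i:

```python
# Brute-force sketch of the transposition-invariant Hamming distance
#   min_{t in U} d_H(A + t, B)
# (illustrative only; the paper develops far more efficient algorithms).
# An optimal t must make at least one aligned position match, so it
# suffices to try t = b_i - a_i for every position i.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def ti_hamming(a, b):
    assert len(a) == len(b)
    candidates = {y - x for x, y in zip(a, b)}
    return min(hamming([x + t for x in a], b) for t in candidates)

a = [0, 2, 4, 6]
b = [5, 7, 9, 10]
print(ti_hamming(a, b))  # 1: t = 5 matches every position but the last
```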


Abstract
Researchers are finding that online, crowdsourced collaboration can speed up their work if they choose the right problem. [Excerpt] [...] Yet this open approach has taken root as an ongoing crowdsourcing project called Polymath. [...] Polymath 8 was a triumph for the collaborative approach, says Tao. If mathematicians had been attacking the problem in the standard way, with what he describes as “a flood of mini-papers”, it might have taken years to get the bound down that far. Polymath has not always ...


Abstract
A collaborative online mathematics project holds lessons for other disciplines. [Excerpt] Crowdsourcing has reached mathematics, and at first glance it might seem as if this stereotypically solitary discipline is finally catching up with what other sciences have been doing for years. But, as we explore on page 422, the maths project Polymath, which invites participants to pitch in with ideas and results that might help to solve whatever problem the coordinator has set, is in some ways ahead of the curve. Not ...


Abstract
[Significance] The Greeks described two classes of convex equilateral polyhedron with polyhedral symmetry, the Platonic (including the tetrahedron, octahedron, and icosahedron) and the Archimedean (including the truncated icosahedron with its soccer-ball shape). Johannes Kepler discovered a third class, the rhombic polyhedra. Some carbon fullerenes, inorganic cages, icosahedral viruses, protein complexes, and geodesic structures resemble these polyhedra. Here we add a fourth class, “Goldberg polyhedra.” Their small (corner) faces are regular 3-gons, 4-gons, or 5-gons, whereas their planar 6-gonal faces are equilateral ...


Abstract
Highly heterogeneous degree distributions yield efficient spreading of simple epidemics through networks, but can be inefficient with more complex epidemiological processes. We study diseases with nonlinear force of infection whose prevalences can abruptly collapse to zero while decreasing the transmission parameters. We find that scale-free networks can be unable to support diseases that, on the contrary, are able to persist at high endemic levels in homogeneous networks with the same average degree. ...


Abstract
The question of the modeling of forest fires at large scales is addressed. Empirical models are compared and it is shown that Rothermel's model, describing the rate of spread of a straight front, is included in the envelope model, which in turn is included in a Hamilton–Jacobi equation description. This result shows that the preceding models could be included in reaction-diffusion systems. Then an anisotropic propagation model with a nonlocal radiative term, obtained by asymptotic expansion of a combustion modeling, ...


In Free Software Directory (April 2011)
Abstract
ATLAS (Automatically Tuned Linear Algebra Software) is a system for generating highperformance mathematical libraries. It generates a library that is specifically tuned to your processor and compiler. ATLAS's purpose is to provide portably optimal linear algebra software. In particular, ATLAS provides ANSI C and Fortran 77 interfaces to the BLAS, and a subset of LAPACK. ...


In Free Software Directory (April 2011)
Abstract
'MCSim' is a simulation and statistical inference tool for algebraic or differential equation systems. While other programs have been created to the same end, many of them are not optimal for performing computer-intensive and sophisticated Monte Carlo analyses. MCSim was created specifically to perform Monte Carlo analyses in an optimized and easy-to-maintain environment. ...


In Free Software Directory (October 2011), 8008
Abstract
Mastrave is a free software library written to perform vectorized scientific computing and to be as compatible as possible with both the GNU Octave and Matlab computing frameworks, offering general-purpose, portable and freely available features for the scientific community. Mastrave is mostly oriented to easing complex modeling tasks such as those typically needed within environmental models, even when involving irregular and heterogeneous data series. [Semantic array programming]. The Mastrave project attempts to allow a more effective, quick interoperability between GNU Octave and Matlab ...


In Free Software Directory (April 2011)
Abstract
MedianTracker supports efficient median queries on and dynamic additions to a list of values. It provides both the lower and upper median of all values seen so far. Any __cmp__()able object can be tracked, in addition to numeric types. add() takes log(n) time for a tracker with n items; lower_median() and upper_median() run in constant time. Since all values must be stored, memory usage is proportional to the number of values added (O(n)). ...
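The interface described above can be sketched with the classic two-heap technique (a hypothetical reimplementation for illustration, not the package's code): add() is O(log n), and both medians are read from the heap tops in O(1):

```python
import heapq

# Two-heap sketch of the interface described above (a hypothetical
# reimplementation, not the package's code): a max-heap (negated values)
# holds the lower half and a min-heap the upper half, so add() is
# O(log n) and both medians are read from the heap tops in O(1).

class MedianTracker:
    def __init__(self):
        self._lo = []  # max-heap via negation: lower half of the values
        self._hi = []  # min-heap: upper half of the values

    def add(self, value):
        heapq.heappush(self._lo, -value)
        heapq.heappush(self._hi, -heapq.heappop(self._lo))
        if len(self._hi) > len(self._lo):
            heapq.heappush(self._lo, -heapq.heappop(self._hi))

    def lower_median(self):
        return -self._lo[0]

    def upper_median(self):
        # with an odd count, both medians coincide
        return self._hi[0] if len(self._hi) == len(self._lo) else -self._lo[0]

t = MedianTracker()
for v in [5, 1, 4, 2]:
    t.add(v)
print(t.lower_median(), t.upper_median())  # 2 4
```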


In Free Software Directory (April 2013)
Abstract
These calculators are real-time multi-model option chain pricers with analytics and interactive controls. optionmatrix is the GTK+ graphical user interface version and optionmatrix_console is the Curses version. Both programs feature: greeks, decimal-date to real-date translations, real-date to decimal-date translations, real-time time bleeding, configurable option expiration date engines, calendars, strike control systems, tickers and over 135 option models. optionmatrix also supports: spreads, bonds, term structures, cash flow editing, source code viewing and text exporting. ...


In Free Software Directory (April 2011)
Abstract
PSPP is a program for statistical analysis of sampled data. It is a Free replacement for the proprietary program SPSS. ...


Abstract
The ability to predict terrestrial evapotranspiration (E) is limited by the complexity of ratelimiting pathways as water moves through the soil, vegetation (roots, xylem, stomata), canopy air space, and the atmospheric boundary layer. The impossibility of specifying the numerous parameters required to model this process in full spatial detail has necessitated spatially upscaled models that depend on effective parameters such as the surface vapor conductance (Csurf). Csurf accounts for the biophysical and hydrological effects on diffusion through the soil and vegetation ...


Acta Polytechnica Hungarica, Vol. 2, No. 1. (2005)
Abstract
General nonadditive measures are investigated with the help of some related monotone measures (some types of variations and submeasures), which have some important additional properties. ...


Abstract
Human adults from diverse cultures share intuitions about the points, lines, and figures of Euclidean geometry. Do children develop these intuitions by drawing on phylogenetically ancient and developmentally precocious geometric representations that guide their navigation and their analysis of object shape? In what way might these early-arising representations support later-developing Euclidean intuitions? To approach these questions, we investigated the relations among young children’s use of geometry in tasks assessing: navigation; visual form analysis; and the interpretation of symbolic, purely geometric maps. ...


Abstract
An approach to decision theory based upon non-probabilistic uncertainty is presented. An axiomatization is given of the hybrid probabilistic-possibilistic mixtures based on a pair of triangular conorm and triangular norm satisfying a restricted distributivity law, and of the corresponding non-additive S-measure. This is characterized by the families of operations involved in generalized mixtures, based upon a previous result on the characterization of the pair of continuous t-norm and t-conorm such that the former is restrictedly distributive over the latter. The obtained ...


Abstract
Recent studies show that in interdependent networks a very small failure in one network may lead to catastrophic consequences. Above a critical fraction of interdependent nodes, even a single node failure can invoke cascading failures that may abruptly fragment the system, whereas below this critical dependency a failure of a few nodes leads only to a small amount of damage to the system. So far, research has focused on interdependent random networks without space limitations. However, many real systems, such as ...


Abstract
Measuring modularity is important to understand the structure of networks, and has a number of important real-world implications. However, several measures exist to assess modularity, giving both different modularity values and different module compositions. In this article, I propose an a posteriori measure of modularity, which represents the ratio of interactions between members of the same module vs. members of different modules. I apply this measure to a large dataset of 290 ecological networks, to show that it gives ...
