z componentization By looselycoupled.com Published On :: 2004-09-28T15:00:00-00:00 Breaking down into interchangeable pieces. For many years, software innovators have been trying to make software more like computer hardware, which is assembled from cheap, mass-produced components that connect together using standard interfaces. Component-based development (CBD) uses this approach to assemble software from reusable components within frameworks such as CORBA, Sun's Enterprise Java Beans (EJBs) and Microsoft COM. Today's service oriented architectures, based on web services, go a step further by encapsulating components in a standards-based service interface, which allows components to be reused outside their native framework. Componentization is not limited to software; through the use of subcontracting and outsourcing, it can also apply to business organizations and processes. Full Article
z Correction: Sensitivity analysis for an unobserved moderator in RCT-to-target-population generalization of treatment effects By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Trang Quynh Nguyen, Elizabeth A. Stuart. Source: The Annals of Applied Statistics, Volume 14, Number 1, 518--520. Full Article
z Bayesian mixed effects models for zero-inflated compositions in microbiome data analysis By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Boyu Ren, Sergio Bacallado, Stefano Favaro, Tommi Vatanen, Curtis Huttenhower, Lorenzo Trippa. Source: The Annals of Applied Statistics, Volume 14, Number 1, 494--517. Abstract: Detecting associations between microbial compositions and sample characteristics is one of the most important tasks in microbiome studies. Most of the existing methods apply univariate models to single microbial species separately, with adjustments for multiple hypothesis testing. We propose a Bayesian analysis for a generalized mixed effects linear model tailored to this application. The marginal prior on each microbial composition is a Dirichlet process, and dependence across compositions is induced through a linear combination of individual covariates, such as disease biomarkers or the subject’s age, and latent factors. The latent factors capture residual variability and their dimensionality is learned from the data in a fully Bayesian procedure. The proposed model is tested in data analyses and simulation studies with zero-inflated compositions. In these settings and within each sample, a large proportion of counts per microbial species are equal to zero. In our Bayesian model, the prior probability of compositions with absent microbial species is strictly positive. We propose an efficient algorithm to sample from the posterior, and visualizations of model parameters which reveal associations between covariates and microbial compositions. We evaluate the proposed method in simulation studies, and then analyze a microbiome dataset for infants with type 1 diabetes which contains a large proportion of zeros in the sample-specific microbial compositions. Full Article
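The zero-inflation phenomenon the abstract describes is easy to picture with a small simulator. This is a hypothetical generator for illustration only (the species count, absence probability and Dirichlet weights are all assumptions), not the authors' Dirichlet-process model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_zero_inflated_composition(n_species=10, zero_prob=0.6,
                                     total_counts=1000, rng=rng):
    """Draw one sample's species counts: each species is absent with
    probability `zero_prob`; surviving species get Dirichlet weights,
    and counts are multinomial given those weights."""
    present = rng.random(n_species) > zero_prob
    if not present.any():                   # force at least one species present
        present[rng.integers(n_species)] = True
    weights = np.zeros(n_species)
    weights[present] = rng.dirichlet(np.ones(present.sum()))
    return rng.multinomial(total_counts, weights)

counts = sample_zero_inflated_composition()
```

Each generated sample has many exact zeros, mimicking the "large proportion of counts per microbial species equal to zero" setting in the abstract.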
z Feature selection for generalized varying coefficient mixed-effect models with application to obesity GWAS By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Wanghuan Chu, Runze Li, Jingyuan Liu, Matthew Reimherr. Source: The Annals of Applied Statistics, Volume 14, Number 1, 276--298.Abstract: Motivated by an empirical analysis of data from a genome-wide association study on obesity, measured by the body mass index (BMI), we propose a two-step gene-detection procedure for generalized varying coefficient mixed-effects models with ultrahigh dimensional covariates. The proposed procedure selects significant single nucleotide polymorphisms (SNPs) impacting the mean BMI trend, some of which have already been biologically proven to be “fat genes.” The method also discovers SNPs that significantly influence the age-dependent variability of BMI. The proposed procedure takes into account individual variations of genetic effects and can also be directly applied to longitudinal data with continuous, binary or count responses. We employ Monte Carlo simulation studies to assess the performance of the proposed method and further carry out causal inference for the selected SNPs. Full Article
z Efficient real-time monitoring of an emerging influenza pandemic: How feasible? By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Paul J. Birrell, Lorenz Wernisch, Brian D. M. Tom, Leonhard Held, Gareth O. Roberts, Richard G. Pebody, Daniela De Angelis. Source: The Annals of Applied Statistics, Volume 14, Number 1, 74--93.Abstract: A prompt public health response to a new epidemic relies on the ability to monitor and predict its evolution in real time as data accumulate. The 2009 A/H1N1 outbreak in the UK revealed pandemic data as noisy, contaminated, potentially biased and originating from multiple sources. This seriously challenges the capacity for real-time monitoring. Here, we assess the feasibility of real-time inference based on such data by constructing an analytic tool combining an age-stratified SEIR transmission model with various observation models describing the data generation mechanisms. As batches of data become available, a sequential Monte Carlo (SMC) algorithm is developed to synthesise multiple imperfect data streams, iterate epidemic inferences and assess model adequacy amidst a rapidly evolving epidemic environment, substantially reducing computation time in comparison to standard MCMC, to ensure timely delivery of real-time epidemic assessments. In application to simulated data designed to mimic the 2009 A/H1N1 epidemic, SMC is shown to have additional benefits in terms of assessing predictive performance and coping with parameter nonidentifiability. Full Article
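A toy-scale sketch of the SMC idea described above: a simplified bootstrap particle filter on a discrete-time stochastic SEIR model with Poisson-observed case counts. All rates, the observation model, population size and case numbers are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

def seir_step(state, beta, sigma=0.25, gamma=0.2, rng=rng):
    """One stochastic step of a discrete-time SEIR compartment model."""
    S, E, I, R = state
    N = S + E + I + R
    new_E = rng.binomial(int(S), 1 - np.exp(-beta * I / N))
    new_I = rng.binomial(int(E), 1 - np.exp(-sigma))
    new_R = rng.binomial(int(I), 1 - np.exp(-gamma))
    return np.array([S - new_E, E + new_E - new_I, I + new_I - new_R, R + new_R]), new_I

def particle_filter(obs, n_particles=500, rng=rng):
    """Bootstrap SMC: propagate particles, weight by a Poisson observation
    density of reported cases, then resample."""
    betas = rng.uniform(0.1, 1.0, n_particles)            # unknown transmission rate
    states = np.tile([990.0, 5.0, 5.0, 0.0], (n_particles, 1))
    for y in obs:
        cases = np.empty(n_particles)
        for i in range(n_particles):
            states[i], cases[i] = seir_step(states[i], betas[i])
        lam = np.maximum(cases, 1e-9)
        logw = y * np.log(lam) - lam                      # unnormalized Poisson log-lik
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        states, betas = states[idx], betas[idx]
    return betas

betas_post = particle_filter([3, 6, 11, 18, 30])
```

As new batches of counts arrive, only the latest propagate-weight-resample step is needed, which is the source of the speedup over refitting by MCMC.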
z New formulation of the logistic-Gaussian process to analyze trajectory tracking data By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Gianluca Mastrantonio, Clara Grazian, Sara Mancinelli, Enrico Bibbona. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2483--2508. Abstract: Improved communication systems, shrinking battery sizes and the price drop of tracking devices have led to an increasing availability of trajectory tracking data. These data are often analyzed to understand animal behavior. In this work, we propose a new model for interpreting the animal movement as a mixture of characteristic patterns that we interpret as different behaviors. The probability that the animal is behaving according to a specific pattern, at each time instant, is nonparametrically estimated using the Logistic-Gaussian process. Owing to a new formalization and the way we specify the coregionalization matrix of the associated multivariate Gaussian process, our model is invariant with respect to the choice of the reference element and of the ordering of the probability vector components. We fit the model under a Bayesian framework, and show that the Markov chain Monte Carlo algorithm we propose is straightforward to implement. We perform a simulation study with the aim of showing the ability of the estimation procedure to retrieve the model parameters. We also test the performance of the information criterion we used to select the number of behaviors. The model is then applied to a real dataset where a wolf has been observed before and after procreation. The results are easy to interpret, and clear differences emerge in the two phases. Full Article
z Outline analyses of the called strike zone in Major League Baseball By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Dale L. Zimmerman, Jun Tang, Rui Huang. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2416--2451.Abstract: We extend statistical shape analytic methods known as outline analysis for application to the strike zone, a central feature of the game of baseball. Although the strike zone is rigorously defined by Major League Baseball’s official rules, umpires make mistakes in calling pitches as strikes (and balls) and may even adhere to a strike zone somewhat different than that prescribed by the rule book. Our methods yield inference on geometric attributes (centroid, dimensions, orientation and shape) of this “called strike zone” (CSZ) and on the effects that years, umpires, player attributes, game situation factors and their interactions have on those attributes. The methodology consists of first using kernel discriminant analysis to determine a noisy outline representing the CSZ corresponding to each factor combination, then fitting existing elliptic Fourier and new generalized superelliptic models for closed curves to that outline and finally analyzing the fitted model coefficients using standard methods of regression analysis, factorial analysis of variance and variance component estimation. We apply these methods to PITCHf/x data comprising more than three million called pitches from the 2008–2016 Major League Baseball seasons to address numerous questions about the CSZ. We find that all geometric attributes of the CSZ, except its size, became significantly more like those of the rule-book strike zone from 2008–2016 and that several player attribute/game situation factors had statistically and practically significant effects on many of them. 
We also establish that the variation in the horizontal center, width and area of an individual umpire’s CSZ from pitch to pitch is smaller than their variation among CSZs from different umpires. Full Article
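As a sketch of the elliptic Fourier step mentioned above, here are the classical Kuhl–Giardina elliptic Fourier coefficients of a closed polygonal outline (a generic implementation for illustration; the paper's generalized superelliptic models are not reproduced):

```python
import numpy as np

def elliptic_fourier_coeffs(xy, n_harmonics=4):
    """Elliptic Fourier coefficients (a_n, b_n, c_n, d_n) of a closed
    polygon given as rows of xy, via the Kuhl-Giardina chain-increment
    formulas."""
    dxy = np.diff(np.vstack([xy, xy[:1]]), axis=0)      # close the outline
    dt = np.hypot(dxy[:, 0], dxy[:, 1])                 # segment lengths
    t = np.concatenate([[0.0], np.cumsum(dt)])
    T = t[-1]
    coeffs = np.zeros((n_harmonics, 4))
    for n in range(1, n_harmonics + 1):
        c = np.cos(2 * np.pi * n * t / T)
        s = np.sin(2 * np.pi * n * t / T)
        k = T / (2 * np.pi ** 2 * n ** 2)
        coeffs[n - 1] = [
            k * np.sum(dxy[:, 0] / dt * (c[1:] - c[:-1])),  # a_n
            k * np.sum(dxy[:, 0] / dt * (s[1:] - s[:-1])),  # b_n
            k * np.sum(dxy[:, 1] / dt * (c[1:] - c[:-1])),  # c_n
            k * np.sum(dxy[:, 1] / dt * (s[1:] - s[:-1])),  # d_n
        ]
    return coeffs

# Sanity check on a unit circle: the first harmonic carries the shape,
# higher harmonics are essentially zero.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
C = elliptic_fourier_coeffs(circle)
```

Fitted coefficients like these are what the abstract's regression and ANOVA analyses treat as responses.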
z Bayesian modeling of the structural connectome for studying Alzheimer’s disease By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Arkaprava Roy, Subhashis Ghosal, Jeffrey Prescott, Kingshuk Roy Choudhury. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1791--1816. Abstract: We study possible relations between Alzheimer’s disease progression and the structure of the connectome, the white matter connecting different regions of the brain. Regression models for the extent of white matter connecting each pair of brain regions are proposed, with covariates including age, gender and disease status. Subject inhomogeneity is also incorporated in the model through random effects with an unknown distribution. As there is a large number of pairs of regions, we also adopt a dimension reduction technique through graphon (J. Combin. Theory Ser. B 96 (2006) 933–957) functions, which reduces functions of pairs of regions to functions of regions. The connecting graphon functions are considered unknown, but the assumed smoothness allows putting priors of low complexity on these functions. We pursue a nonparametric Bayesian approach by assigning a Dirichlet process scale mixture of zero-mean normals prior on the distributions of the random effects and finite random series of tensor products of B-splines priors on the underlying graphon functions. We develop efficient Markov chain Monte Carlo techniques for drawing samples from the posterior distributions using Hamiltonian Monte Carlo (HMC). The proposed Bayesian method overwhelmingly outperforms a competing method based on ANCOVA models in the simulation setup. The proposed Bayesian approach is applied to a dataset of 100 subjects and 83 brain regions, and key regions implicated in the changing connectome are identified. Full Article
z RCRnorm: An integrated system of random-coefficient hierarchical regression models for normalizing NanoString nCounter data By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Gaoxiang Jia, Xinlei Wang, Qiwei Li, Wei Lu, Ximing Tang, Ignacio Wistuba, Yang Xie. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1617--1647. Abstract: Formalin-fixed paraffin-embedded (FFPE) samples have great potential for biomarker discovery, retrospective studies and diagnosis or prognosis of diseases. Their application, however, is hindered by the unsatisfactory performance of traditional gene expression profiling techniques on damaged RNAs. The NanoString nCounter platform is well suited for profiling of FFPE samples and measures gene expression with high sensitivity, which may greatly facilitate realization of the scientific and clinical value of FFPE samples. However, methodological development for normalization, a critical step when analyzing this type of data, is far behind. Existing methods designed for the platform use information from different types of internal controls separately and rely on an oversimplified assumption that expression of housekeeping genes is constant across samples for global scaling. Thus, these methods are not optimized for the nCounter system, not to mention that they were not developed for FFPE samples. We construct an integrated system of random-coefficient hierarchical regression models to capture the main patterns and characteristics observed from NanoString data of FFPE samples, and develop a Bayesian approach to estimate parameters and normalize gene expression across samples. Our method, labeled RCRnorm, incorporates information from all aspects of the experimental design and simultaneously removes biases from various sources. It eliminates the unrealistic assumption on housekeeping genes and offers great interpretability. Furthermore, it is applicable to freshly frozen or similar samples, which can generally be viewed as a reduced case of FFPE samples. Simulation and applications showed the superior performance of RCRnorm. Full Article
z A hidden Markov model approach to characterizing the photo-switching behavior of fluorophores By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Lekha Patel, Nils Gustafsson, Yu Lin, Raimund Ober, Ricardo Henriques, Edward Cohen. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1397--1429.Abstract: Fluorescing molecules (fluorophores) that stochastically switch between photon-emitting and dark states underpin some of the most celebrated advancements in super-resolution microscopy. While this stochastic behavior has been heavily exploited, full characterization of the underlying models can potentially drive forward further imaging methodologies. Under the assumption that fluorophores move between fluorescing and dark states as continuous time Markov processes, the goal is to use a sequence of images to select a model and estimate the transition rates. We use a hidden Markov model to relate the observed discrete time signal to the hidden continuous time process. With imaging involving several repeat exposures of the fluorophore, we show the observed signal depends on both the current and past states of the hidden process, producing emission probabilities that depend on the transition rate parameters to be estimated. To tackle this unusual coupling of the transition and emission probabilities, we conceive transmission (transition-emission) matrices that capture all dependencies of the model. We provide a scheme of computing these matrices and adapt the forward-backward algorithm to compute a likelihood which is readily optimized to provide rate estimates. When confronted with several model proposals, combining this procedure with the Bayesian Information Criterion provides accurate model selection. Full Article
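For orientation, the standard scaled forward recursion for a discrete-time HMM looks like the following. This is a generic sketch: the paper's transmission (transition-emission) matrices, which handle the coupling of transition and emission probabilities, are not reproduced. The two states loosely stand for a fluorophore's emitting and dark states with imperfect detection; all numbers are illustrative assumptions:

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward recursion: returns log P(obs) for an HMM with
    initial probabilities pi, transition matrix A, and emission matrix
    B[state, symbol]."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]       # propagate, then weight by emission
        log_lik += np.log(alpha.sum())      # accumulate the scaling factors
        alpha = alpha / alpha.sum()
    return log_lik

# Illustrative two-state (emitting / dark) setup with noisy observations.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.3, 0.7]])
B = np.array([[0.95, 0.05],
              [0.10, 0.90]])
ll = forward_log_likelihood([0, 0, 1, 1, 0], pi, A, B)
```

Maximizing such a likelihood over the entries of A (parameterized by transition rates) is the estimation step the abstract describes; model selection then compares the maximized likelihoods via BIC.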
z A characterization of the finiteness of perpetual integrals of Lévy processes By projecteuclid.org Published On :: Fri, 31 Jan 2020 04:06 EST Martin Kolb, Mladen Savov. Source: Bernoulli, Volume 26, Number 2, 1453--1472. Abstract: We derive a criterion for the almost sure finiteness of perpetual integrals of Lévy processes for a class of real functions including all continuous functions, and for general one-dimensional Lévy processes that drift to plus infinity. This generalizes previous work of Döring and Kyprianou, who considered Lévy processes having a local time, leaving the general case as an open problem. It turns out that the criterion simplifies significantly in the situation where the process has a local time, but we also demonstrate that in general our criterion cannot be reduced. This answers an open problem posed in (J. Theoret. Probab. 29 (2016) 1192–1198). Full Article
z Strictly weak consensus in the uniform compass model on $\mathbb{Z}$ By projecteuclid.org Published On :: Fri, 31 Jan 2020 04:06 EST Nina Gantert, Markus Heydenreich, Timo Hirscher. Source: Bernoulli, Volume 26, Number 2, 1269--1293. Abstract: We investigate a model for opinion dynamics, where individuals (modeled by vertices of a graph) hold certain abstract opinions. As time progresses, neighboring individuals interact with each other, and this interaction results in a realignment of opinions closer towards each other. This mechanism triggers formation of consensus among the individuals. Our main focus is on strong consensus (i.e., global agreement of all individuals) versus weak consensus (i.e., local agreement among neighbors). By extending a known model to a more general opinion space, which lacks a “central” opinion acting as a contraction point, we provide an example of an opinion formation process on the one-dimensional lattice $\mathbb{Z}$ with weak consensus but no strong consensus. Full Article
z Characterization of probability distribution convergence in Wasserstein distance by $L^{p}$-quantization error function By projecteuclid.org Published On :: Fri, 31 Jan 2020 04:06 EST Yating Liu, Gilles Pagès. Source: Bernoulli, Volume 26, Number 2, 1171--1204. Abstract: We establish conditions to characterize probability measures by their $L^{p}$-quantization error functions in both $\mathbb{R}^{d}$ and Hilbert settings. This characterization is two-fold: static (identity of two distributions) and dynamic (convergence for the $L^{p}$-Wasserstein distance). We first propose a criterion on the quantization level $N$, valid for any norm on $\mathbb{R}^{d}$ and any order $p$, based on a geometrical approach involving the Voronoï diagram. Then, we prove that in the $L^{2}$-case on a (separable) Hilbert space, the condition on the level $N$ can be reduced to $N=2$, which is optimal. More quantization-based characterization cases in dimension 1 and a discussion of the completeness of a distance defined by the quantization error function can be found at the end of this paper. Full Article
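The $L^{p}$-quantization error function in the abstract is concrete to compute empirically. A sketch for $p=2$, $d=1$ follows; the grid $\pm\sqrt{2/\pi}\approx\pm 0.7979$ used below is the known optimal two-point quantizer of the standard normal, included purely as an illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def l2_quantization_error(sample, grid):
    """Empirical L^2 quantization error: root mean squared distance of
    each sample point to its nearest grid point (the center of its
    Voronoi cell)."""
    d = np.abs(sample[:, None] - grid[None, :]).min(axis=1)
    return np.sqrt((d ** 2).mean())

x = rng.normal(size=50000)
e_sym = l2_quantization_error(x, np.array([-1.0, 1.0]))
e_opt = l2_quantization_error(x, np.array([-0.7979, 0.7979]))  # ~ +/- sqrt(2/pi)
```

Evaluating this function over levels $N$ and comparing it across measures is the object whose static and dynamic characterizations the paper establishes.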
z Prediction and estimation consistency of sparse multi-class penalized optimal scoring By projecteuclid.org Published On :: Tue, 26 Nov 2019 04:00 EST Irina Gaynanova. Source: Bernoulli, Volume 26, Number 1, 286--322.Abstract: Sparse linear discriminant analysis via penalized optimal scoring is a successful tool for classification in high-dimensional settings. While the variable selection consistency of sparse optimal scoring has been established, the corresponding prediction and estimation consistency results have been lacking. We bridge this gap by providing probabilistic bounds on out-of-sample prediction error and estimation error of multi-class penalized optimal scoring allowing for diverging number of classes. Full Article
z The Thomson family : fisherman in Buckhaven, retailers in Kapunda / compiled by Elizabeth Anne Howell. By www.catalog.slsa.sa.gov.au Published On :: Thomson (Family) Full Article
z From Westphalia to South Australia : the story of Franz Heinrich Ernst Siekmann / by Peter Brinkworth. By www.catalog.slsa.sa.gov.au Published On :: Siekmann, Francis Heinrich Ernst, 1830-1917. Full Article
z A family history Siglin to Siegele 1530 to 2019 : from Ditzingen, Germany over land and sea / Ian G. Siegele. By www.catalog.slsa.sa.gov.au Published On :: Germans -- South Australia. Full Article
z Slow train to Auschwitz : memoirs of a life in war and peace / Peter Kraus. By www.catalog.slsa.sa.gov.au Published On :: Kraus, Peter -- Biography. Full Article
z From Wends we came : the story of Johann and Maria Huppatz & their descendants / compiled by Frank Huppatz and Rone McDonnell. By www.catalog.slsa.sa.gov.au Published On :: Huppatz (Family). Full Article
z Art Around the Library - Zine to Artist's Book By feedproxy.google.com Published On :: Mon, 04 May 2020 01:43:33 +0000 Find out how easy it is to make a ‘zine’ and you’re well on your way to producing your own mini books. Full Article
z Art Around the Library - Zentangles By feedproxy.google.com Published On :: Mon, 04 May 2020 01:58:58 +0000 Discover the enjoyment of a meandering line with decoration and zentangle your way into the holidays! Full Article
z Anarchy in Venezuela's jails laid bare by massacre over food By news.yahoo.com Published On :: Fri, 08 May 2020 13:27:04 -0400 Three weeks before he was shot dead, Miguel Calderon, an inmate in the lawless Los Llanos jail on Venezuela's central plains, sent a voice message to his father. Like many of the prisoners in Venezuela's overcrowded and violent penitentiaries, Los Llanos's 4,000 inmates normally subsist on food relatives bring them. The guards, desperate themselves amid national shortages, began stealing the little food getting behind bars, inmates said, forcing some prisoners to turn to eating stray animals. Full Article
z Chaffetz: I don't understand why Adam Schiff continues to have a security clearance By news.yahoo.com Published On :: Fri, 08 May 2020 14:43:30 -0400 Fox News contributor Jason Chaffetz and Andy McCarthy react to House Intelligence transcripts on Russia probe. Full Article
z New Zealand says it backs Taiwan's role in WHO due to success with coronavirus By news.yahoo.com Published On :: Thu, 07 May 2020 23:20:43 -0400 Full Article
z Cruz gets his hair cut at salon whose owner was jailed for defying Texas coronavirus restrictions By news.yahoo.com Published On :: Fri, 08 May 2020 19:28:43 -0400 After his haircut, Sen. Ted Cruz said, "It was ridiculous to see somebody sentenced to seven days in jail for cutting hair." Full Article
z Brazil's Amazon: Surge in deforestation as military prepares to deploy By news.yahoo.com Published On :: Fri, 08 May 2020 17:17:52 -0400 The military is preparing to deploy to the region to try to stop illegal logging and mining. Full Article
z Hierarchical Normalized Completely Random Measures for Robust Graphical Modeling By projecteuclid.org Published On :: Thu, 19 Dec 2019 22:10 EST Andrea Cremaschi, Raffaele Argiento, Katherine Shoemaker, Christine Peterson, Marina Vannucci. Source: Bayesian Analysis, Volume 14, Number 4, 1271--1301. Abstract: Gaussian graphical models are useful tools for exploring network structures in multivariate normal data. In this paper we are interested in situations where data show departures from Gaussianity, therefore requiring alternative modeling distributions. The multivariate $t$-distribution, obtained by dividing each component of the data vector by a gamma random variable, is a straightforward generalization to accommodate deviations from normality such as heavy tails. Since different groups of variables may be contaminated to a different extent, Finegold and Drton (2014) introduced the Dirichlet $t$-distribution, where the divisors are clustered using a Dirichlet process. In this work, we consider a more general class of nonparametric distributions as the prior on the divisor terms, namely the class of normalized completely random measures (NormCRMs). To improve the effectiveness of the clustering, we propose modeling the dependence among the divisors through a nonparametric hierarchical structure, which allows for the sharing of parameters across the samples in the data set. This desirable feature enables us to cluster together different components of multivariate data in a parsimonious way. We demonstrate through simulations that this approach provides accurate graphical model inference, and apply it to a case study examining the dependence structure in radiomics data derived from The Cancer Imaging Atlas. Full Article
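The divisor construction the abstract starts from is short to write down. This is a plain sketch of the classical multivariate $t$ as a Gaussian scale mixture (the Dirichlet-$t$ and NormCRM clustering of divisors is not reproduced; dimensions and degrees of freedom are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def multivariate_t(mean, cov, df, size, rng=rng):
    """Multivariate t variates as a Gaussian scale mixture: divide each
    Gaussian draw by sqrt(g), where g ~ chi^2_df / df (a scaled gamma
    variate), producing heavy tails."""
    z = rng.multivariate_normal(np.zeros(len(mean)), cov, size=size)
    g = rng.chisquare(df, size=size) / df
    return mean + z / np.sqrt(g)[:, None]

draws = multivariate_t(np.zeros(3), np.eye(3), df=5, size=4000)
```

Replacing the single shared divisor per observation with clustered, component-specific divisors is exactly the generalization path the abstract traces from the classical $t$ to the Dirichlet-$t$ and on to NormCRMs.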
z Implicit Copulas from Bayesian Regularized Regression Smoothers By projecteuclid.org Published On :: Thu, 19 Dec 2019 22:10 EST Nadja Klein, Michael Stanley Smith. Source: Bayesian Analysis, Volume 14, Number 4, 1143--1171.Abstract: We show how to extract the implicit copula of a response vector from a Bayesian regularized regression smoother with Gaussian disturbances. The copula can be used to compare smoothers that employ different shrinkage priors and function bases. We illustrate with three popular choices of shrinkage priors—a pairwise prior, the horseshoe prior and a g prior augmented with a point mass as employed for Bayesian variable selection—and both univariate and multivariate function bases. The implicit copulas are high-dimensional, have flexible dependence structures that are far from that of a Gaussian copula, and are unavailable in closed form. However, we show how they can be evaluated by first constructing a Gaussian copula conditional on the regularization parameters, and then integrating over these. Combined with non-parametric margins the regularized smoothers can be used to model the distribution of non-Gaussian univariate responses conditional on the covariates. Efficient Markov chain Monte Carlo schemes for evaluating the copula are given for this case. Using both simulated and real data, we show how such copula smoothing models can improve the quality of resulting function estimates and predictive distributions. Full Article
z Bayesian Zero-Inflated Negative Binomial Regression Based on Pólya-Gamma Mixtures By projecteuclid.org Published On :: Tue, 11 Jun 2019 04:00 EDT Brian Neelon. Source: Bayesian Analysis, Volume 14, Number 3, 849--875.Abstract: Motivated by a study examining spatiotemporal patterns in inpatient hospitalizations, we propose an efficient Bayesian approach for fitting zero-inflated negative binomial models. To facilitate posterior sampling, we introduce a set of latent variables that are represented as scale mixtures of normals, where the precision terms follow independent Pólya-Gamma distributions. Conditional on the latent variables, inference proceeds via straightforward Gibbs sampling. For fixed-effects models, our approach is comparable to existing methods. However, our model can accommodate more complex data structures, including multivariate and spatiotemporal data, settings in which current approaches often fail due to computational challenges. Using simulation studies, we highlight key features of the method and compare its performance to other estimation procedures. We apply the approach to a spatiotemporal analysis examining the number of annual inpatient admissions among United States veterans with type 2 diabetes. Full Article
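For reference, the likelihood being augmented: a zero-inflated negative binomial mixes a point mass at zero with an ordinary NB count. This sketch shows the pmf only; the Pólya-Gamma augmentation itself requires sampling PG variates and is not reproduced here (parameter values in the test are arbitrary):

```python
from math import exp, lgamma, log

def nb_pmf(y, r, p):
    """Negative binomial pmf with size r and success probability p:
    P(Y = y) = C(y + r - 1, y) * p^r * (1 - p)^y."""
    return exp(lgamma(y + r) - lgamma(r) - lgamma(y + 1)
               + r * log(p) + y * log(1 - p))

def zinb_pmf(y, pi0, r, p):
    """Zero-inflated NB: a structural zero with probability pi0,
    otherwise an ordinary NB count (which may itself be zero)."""
    return pi0 * (y == 0) + (1 - pi0) * nb_pmf(y, r, p)
```

The latent-variable trick in the abstract rewrites the logit terms hidden in this likelihood as normal scale mixtures with Pólya-Gamma precisions, so that the conditional updates become Gaussian.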
z Fast Model-Fitting of Bayesian Variable Selection Regression Using the Iterative Complex Factorization Algorithm By projecteuclid.org Published On :: Wed, 13 Mar 2019 22:00 EDT Quan Zhou, Yongtao Guan. Source: Bayesian Analysis, Volume 14, Number 2, 573--594. Abstract: Bayesian variable selection regression (BVSR) is able to jointly analyze genome-wide genetic datasets, but slow computation via Markov chain Monte Carlo (MCMC) has hampered its widespread usage. Here we present a novel iterative method to solve a special class of linear systems, which can increase the speed of the BVSR model-fitting tenfold. The iterative method hinges on the complex factorization of the sum of two matrices, and the solution path resides in the complex domain (instead of the real domain). Compared to the Gauss-Seidel method, the complex factorization converges almost instantaneously and its error is several orders of magnitude smaller than that of the Gauss-Seidel method. More importantly, its error is always within the pre-specified precision while that of the Gauss-Seidel method is not. For large problems with thousands of covariates, the complex factorization is 10–100 times faster than either the Gauss-Seidel method or the direct method via the Cholesky decomposition. In BVSR, one needs to repetitively solve large penalized regression systems whose design matrices only change slightly between adjacent MCMC steps. This slight change in design matrix enables the adaptation of the iterative complex factorization method. The computational innovation will facilitate the widespread use of BVSR in reanalyzing genome-wide association datasets. Full Article
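For context, here is the Gauss-Seidel baseline the abstract compares against, applied to a small penalized regression system of the form X'X + D (a generic sketch with arbitrary simulated data; the paper's iterative complex factorization is not reproduced):

```python
import numpy as np

def gauss_seidel(A, b, n_iter=200):
    """Plain Gauss-Seidel iteration for A x = b, A symmetric positive
    definite (e.g. a ridge system X'X + D). Each coordinate is updated
    in place using the latest values of the others."""
    x = np.zeros_like(b, dtype=float)
    diag = np.diag(A)
    for _ in range(n_iter):
        for i in range(len(b)):
            # subtract the off-diagonal contribution, solve for x[i]
            x[i] = (b[i] - A[i] @ x + diag[i] * x[i]) / diag[i]
    return x

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 8))
A = X.T @ X + 2.0 * np.eye(8)        # penalized regression system
b = X.T @ rng.normal(size=100)
beta = gauss_seidel(A, b)
```

In BVSR, systems like this are solved repeatedly with design matrices that change only slightly between MCMC steps, which is what makes warm-started iterative solvers attractive.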
z Constrained Bayesian Optimization with Noisy Experiments By projecteuclid.org Published On :: Wed, 13 Mar 2019 22:00 EDT Benjamin Letham, Brian Karrer, Guilherme Ottoni, Eytan Bakshy. Source: Bayesian Analysis, Volume 14, Number 2, 495--519.Abstract: Randomized experiments are the gold standard for evaluating the effects of changes to real-world systems. Data in these tests may be difficult to collect and outcomes may have high variance, resulting in potentially large measurement error. Bayesian optimization is a promising technique for efficiently optimizing multiple continuous parameters, but existing approaches degrade in performance when the noise level is high, limiting its applicability to many randomized experiments. We derive an expression for expected improvement under greedy batch optimization with noisy observations and noisy constraints, and develop a quasi-Monte Carlo approximation that allows it to be efficiently optimized. Simulations with synthetic functions show that optimization performance on noisy, constrained problems outperforms existing methods. We further demonstrate the effectiveness of the method with two real-world experiments conducted at Facebook: optimizing a ranking system, and optimizing server compiler flags. Full Article
z Efficient Bayesian Regularization for Graphical Model Selection By projecteuclid.org Published On :: Wed, 13 Mar 2019 22:00 EDT Suprateek Kundu, Bani K. Mallick, Veera Baladandayuthapani. Source: Bayesian Analysis, Volume 14, Number 2, 449--476.Abstract: There has been an intense development in the Bayesian graphical model literature over the past decade; however, most of the existing methods are restricted to moderate dimensions. We propose a novel graphical model selection approach for large dimensional settings where the dimension increases with the sample size, by decoupling model fitting and covariance selection. First, a full model based on a complete graph is fit under a novel class of mixtures of inverse–Wishart priors, which induce shrinkage on the precision matrix under an equivalence with Cholesky-based regularization, while enabling conjugate updates. Subsequently, a post-fitting model selection step uses penalized joint credible regions to perform model selection. This allows our methods to be computationally feasible for large dimensional settings using a combination of straightforward Gibbs samplers and efficient post-fitting inferences. Theoretical guarantees in terms of selection consistency are also established. Simulations show that the proposed approach compares favorably with competing methods, both in terms of accuracy metrics and computation times. We apply this approach to a cancer genomics data example. Full Article
z Gaussianization Machines for Non-Gaussian Function Estimation Models By projecteuclid.org Published On :: Wed, 08 Jan 2020 04:00 EST T. Tony Cai. Source: Statistical Science, Volume 34, Number 4, 635--656.Abstract: A wide range of nonparametric function estimation models have been studied individually in the literature. Among them the homoscedastic nonparametric Gaussian regression is arguably the best known and understood. Inspired by the asymptotic equivalence theory, Brown, Cai and Zhou (Ann. Statist. 36 (2008) 2055–2084; Ann. Statist. 38 (2010) 2005–2046) and Brown et al. (Probab. Theory Related Fields 146 (2010) 401–433) developed a unified approach that turns a collection of non-Gaussian function estimation models into a standard Gaussian regression, to which any good Gaussian nonparametric regression method can then be applied. These Gaussianization Machines have two key components: binning and transformation. When combined with BlockJS, a wavelet thresholding procedure for Gaussian regression, the procedures are computationally efficient with strong theoretical guarantees. Technical analysis in the papers above shows that the estimators attain the optimal rate of convergence adaptively over a large set of Besov spaces and across a collection of non-Gaussian function estimation models, including robust nonparametric regression, density estimation, and nonparametric regression in exponential families. The estimators are also spatially adaptive. The Gaussianization Machines significantly extend the flexibility and scope of the theories and methodologies originally developed for the conventional nonparametric Gaussian regression. This article aims to provide a concise account of the Gaussianization Machines developed in Brown, Cai and Zhou (Ann. Statist. 36 (2008) 2055–2084; Ann. Statist. 38 (2010) 2005–2046) and Brown et al. (Probab. Theory Related Fields 146 (2010) 401–433). Full Article
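The binning-and-transformation idea can be illustrated with a minimal sketch (illustrative assumptions only, not the authors' BlockJS pipeline): for density estimation, bin the samples so that each bin count is approximately Poisson, then apply a root transform such as 2√(count + 1/4), after which the binned values are roughly homoscedastic Gaussian and any Gaussian smoother can be applied. The bin count and the transform constant below are assumptions for illustration.

```python
import math


def vst_poisson(count):
    """Root transform that approximately Gaussianizes a Poisson count.

    The 1/4 offset is one common mean-matching choice; treat it as an
    illustrative assumption rather than the definitive constant.
    """
    return 2.0 * math.sqrt(count + 0.25)


def gaussianize_density(samples, num_bins=16):
    """Bin samples, then variance-stabilize the bin counts.

    Returns one approximately Gaussian value per bin; a standard
    Gaussian nonparametric regression method can then be run on them.
    """
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / num_bins or 1.0  # guard against zero-width range
    counts = [0] * num_bins
    for x in samples:
        j = min(int((x - lo) / width), num_bins - 1)
        counts[j] += 1
    return [vst_poisson(c) for c in counts]
```

After this step, the original non-Gaussian estimation problem has been reduced to Gaussian regression on the transformed bin values.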
z Comment: “Models as Approximations I: Consequences Illustrated with Linear Regression” by A. Buja, R. Berk, L. Brown, E. George, E. Pitkin, L. Zhao and K. Zhang By projecteuclid.org Published On :: Wed, 08 Jan 2020 04:00 EST Roderick J. Little. Source: Statistical Science, Volume 34, Number 4, 580--583. Full Article
z ROS Regression: Integrating Regularization with Optimal Scaling Regression By projecteuclid.org Published On :: Fri, 11 Oct 2019 04:03 EDT Jacqueline J. Meulman, Anita J. van der Kooij, Kevin L. W. Duisters. Source: Statistical Science, Volume 34, Number 3, 361--390.Abstract: We present a methodology for multiple regression analysis that deals with categorical variables (possibly mixed with continuous ones), in combination with regularization, variable selection and high-dimensional data ($P \gg N$). Regularization and optimal scaling (OS) are two important extensions of ordinary least squares regression (OLS) that will be combined in this paper. There are two data analytic situations for which optimal scaling was developed. One is the analysis of categorical data, and the other is the need for transformations because of nonlinear relationships between predictors and outcome. Optimal scaling of categorical data finds quantifications for the categories, both for the predictors and for the outcome variables, that are optimal for the regression model in the sense that they maximize the multiple correlation. When nonlinear relationships exist, nonlinear transformations of predictors and outcome maximize the multiple correlation in the same way. We will consider a variety of transformation types; typically we use step functions for categorical variables, and smooth (spline) functions for continuous variables. Both types of functions can be restricted to be monotonic, preserving the ordinal information in the data. In combination with optimal scaling, three popular regularization methods will be considered: Ridge regression, the Lasso and the Elastic Net. The resulting method will be called ROS Regression (Regularized Optimal Scaling Regression). 
The OS algorithm provides straightforward and efficient estimation of the regularized regression coefficients, automatically gives the Group Lasso and Blockwise Sparse Regression, and extends them by the possibility to maintain ordinal properties in the data. Extended examples are provided. Full Article
z Statistical Analysis of Zero-Inflated Nonnegative Continuous Data: A Review By projecteuclid.org Published On :: Thu, 18 Jul 2019 22:01 EDT Lei Liu, Ya-Chen Tina Shih, Robert L. Strawderman, Daowen Zhang, Bankole A. Johnson, Haitao Chai. Source: Statistical Science, Volume 34, Number 2, 253--279.Abstract: Zero-inflated nonnegative continuous (or semicontinuous) data arise frequently in biomedical, economic, and ecological studies. Examples include substance abuse, medical costs, medical care utilization, biomarkers (e.g., CD4 cell counts, coronary artery calcium scores), single cell gene expression rates, and (relative) abundance of microbiome. Such data are often characterized by the presence of a large portion of zero values and positive continuous values that are skewed to the right and heteroscedastic. Both of these features suggest that no simple parametric distribution may be suitable for modeling such outcomes. In this paper, we review statistical methods for analyzing zero-inflated nonnegative outcome data. We will start with the cross-sectional setting, discussing ways to separate zero and positive values and introducing flexible models to characterize right skewness and heteroscedasticity in the positive values. We will then present models of correlated zero-inflated nonnegative continuous data, using random effects to tackle the correlation among repeated measures from the same subject and that across different parts of the model. We will also discuss extensions to related topics, for example, zero-inflated count and survival data, nonlinear covariate effects, and joint models of longitudinal zero-inflated nonnegative continuous data and survival. Finally, we will present applications to three real datasets (i.e., microbiome, medical costs, and alcohol drinking) to illustrate these methods. Example code will be provided to facilitate applications of these methods. Full Article
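A common baseline in this literature is the two-part model: one part for the probability of a positive value, and a log-normal part for the positive values, giving marginal mean p·exp(μ + σ²/2). The sketch below is a minimal moment-based version under an assumed log-normal positive part; it is not the full mixed-effects machinery reviewed in the article, and the function names are illustrative.

```python
import math
from statistics import mean, pstdev


def fit_two_part(y):
    """Fit a minimal two-part model to zero-inflated nonnegative data.

    Part 1: empirical probability that Y > 0.
    Part 2: log-normal fit (moment estimates) to the positive values.
    pstdev is the population standard deviation; a likelihood-based fit
    would be used in practice.
    """
    positives = [v for v in y if v > 0]
    p = len(positives) / len(y)
    logs = [math.log(v) for v in positives]
    return p, mean(logs), pstdev(logs)


def marginal_mean(p, mu, sigma):
    """Overall mean: P(Y > 0) times the log-normal mean of the positives."""
    return p * math.exp(mu + sigma * sigma / 2.0)
```

Splitting the model this way is what lets each part handle one feature of the data: the point mass at zero and the right-skewed positive values.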
z Generalized Multiple Importance Sampling By projecteuclid.org Published On :: Fri, 12 Apr 2019 04:00 EDT Víctor Elvira, Luca Martino, David Luengo, Mónica F. Bugallo. Source: Statistical Science, Volume 34, Number 1, 129--155.Abstract: Importance sampling (IS) methods are broadly used to approximate posterior distributions or their moments. In the standard IS approach, samples are drawn from a single proposal distribution and weighted adequately. However, since the performance in IS depends on the mismatch between the targeted and the proposal distributions, several proposal densities are often employed for the generation of samples. Under this multiple importance sampling (MIS) scenario, extensive literature has addressed the selection and adaptation of the proposal distributions, interpreting the sampling and weighting steps in different ways. In this paper, we establish a novel general framework with sampling and weighting procedures when more than one proposal is available. The new framework encompasses most relevant MIS schemes in the literature, and novel valid schemes appear naturally. All the MIS schemes are compared and ranked in terms of the variance of the associated estimators. Finally, we provide illustrative examples revealing that, even with a good choice of the proposal densities, a careful interpretation of the sampling and weighting procedures can make a significant difference in the performance of the method. Full Article
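One weighting scheme covered by such a framework, the deterministic-mixture (balance-heuristic) weighting, can be sketched as follows. The Gaussian target and proposals and the self-normalized estimator of E[X] are illustrative choices, not the paper's notation or its full set of schemes.

```python
import math
import random


def normal_pdf(x, m, s):
    """Density of a Normal(m, s) distribution at x."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))


def dm_mis_estimate(target_pdf, proposals, n_per, rng):
    """Self-normalized MIS estimate of E[X] under the target.

    Each sample is weighted against the full mixture of proposals
    (balance heuristic), not just the proposal that generated it.
    `proposals` is a list of (mean, std) pairs for Gaussian proposals.
    """
    samples, weights = [], []
    for m, s in proposals:
        for _ in range(n_per):
            x = rng.gauss(m, s)
            mix = sum(normal_pdf(x, mj, sj) for mj, sj in proposals) / len(proposals)
            samples.append(x)
            weights.append(target_pdf(x) / mix)
    return sum(w * x for w, x in zip(weights, samples)) / sum(weights)
```

Weighting against the whole mixture keeps a sample's weight bounded even when its own proposal is a poor match for the target, which is one reason mixture-based weighting tends to have lower variance than weighting each sample by its own proposal alone.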
z No smoking zone / design : Biman Mullick. By search.wellcomelibrary.org Published On :: London : Cleanair, Smoke-free Environment (33 Stillness Rd, London, SE23 1NG), [198-?] Full Article
z No smoking zone [Zona de no fumar] / design : Biman Mullick. By search.wellcomelibrary.org Published On :: London : Cleanair Campaign for a Smoke-Free Environment, [198-?] Full Article
z Gabrielle Union's Mesmerizing Tie Dye Activewear Set Is On Sale for Black Friday By www.health.com Published On :: Wed, 27 Nov 2019 15:35:02 -0500 The rainbow sports bra and leggings set from Splits59 is a must-have for anyone craving a pop of color in their workout wardrobe. Full Article
z Jennifer Lopez Is Wearing the Hell Out of These $60 Sneakers—and You Can Buy Them at Zappos By www.health.com Published On :: Mon, 22 Jul 2019 17:56:20 -0400 The chic sneaks are part of Zappos' massive Cyber Monday sale. Full Article
z The Comfy Sneakers That Kate Middleton, Kelly Ripa, and More Celebs Love Are on Sale at Amazon By www.health.com Published On :: Mon, 02 Dec 2019 17:12:09 -0500 Keep your feet comfy and your wallet fat. Full Article
z Sweatsuits Should Be Your Cozy Day Uniform—and These Are Our Favorites From Amazon By www.health.com Published On :: Wed, 11 Dec 2019 11:39:15 -0500 This retro style is making a comeback for a reason. Full Article
z Of Course Katie Holmes Found This Year’s Coziest Winter Boot By www.health.com Published On :: Thu, 12 Dec 2019 12:51:00 -0500 Keep your feet happy this winter. Full Article
z Jennifer Lopez Just Stepped Out in These Glittery Leggings (Again)—and We Found Them on Sale By www.health.com Published On :: Thu, 12 Dec 2019 16:14:56 -0500 They’re already going out of stock. Full Article
z Nike Launches Zoom Pulse Sneakers for Medical Workers Who Are On Their Feet All Day By www.health.com Published On :: Fri, 13 Dec 2019 13:45:17 -0500 The new style is available to shop today. Full Article
z Amazon Just Launched an Exclusive Clothing Collection Full of Warm and Comfy Basics Under $45 By www.health.com Published On :: Tue, 17 Dec 2019 12:23:05 -0500 The womenswear line is new, and there’s already a variety of items to shop. Full Article
z Allometric Analysis Detects Brain Size-Independent Effects of Sex and Sex Chromosome Complement on Human Cerebellar Organization By www.jneurosci.org Published On :: 2017-05-24 Catherine Mankiw, May 24, 2017; 37:5221-5231. Development/Plasticity/Repair Full Article
z Optimization of a GCaMP Calcium Indicator for Neural Activity Imaging By www.jneurosci.org Published On :: 2012-10-03 Jasper Akerboom, Oct 3, 2012; 32:13819-13840. Cellular Full Article