
• #### A General Framework for Constrained Bayesian Optimization using Information-based Search

(MIT Press, 2016-09-24)
We present an information-theoretic framework for solving global black-box optimization problems that also have black-box constraints. Of particular interest to us is to efficiently solve problems with decoupled constraints, ...
• #### A General Framework for Constrained Bayesian Optimization using Information-based Search

(Journal of Machine Learning Research, 2016-09-24)
We present an information-theoretic framework for solving global black-box optimization problems that also have black-box constraints. Of particular interest to us is to efficiently solve problems with $\textit{decoupled}$ ...
• #### Improving PPM with dynamic parameter updates

(IEEE, 2015)
This article makes several improvements to the classic PPM algorithm, resulting in a new algorithm with superior compression effectiveness on human text. The key differences of our algorithm to classic PPM are that (A) ...
• #### Linear Dimensionality Reduction: Survey, Insights, and Generalizations

(MIT Press, 2015-12-01)
Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data ...
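As a concrete instance of the class of methods this survey covers, here is a minimal sketch of principal component analysis via the SVD (the data, dimensions, and function name below are illustrative, not from the paper):

```python
import numpy as np

def pca_project(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # centre the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]                 # scores and components

rng = np.random.default_rng(0)
# Hypothetical data: 3-D latent structure embedded in 10 dimensions plus noise
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 10))
X = latent @ mixing + 0.01 * rng.normal(size=(500, 10))

scores, components = pca_project(X, 3)
# Fraction of total variance captured by the top-3 subspace
Xc = X - X.mean(axis=0)
explained = (scores ** 2).sum() / (Xc ** 2).sum()
```

Since the noise scale is small relative to the signal, the top three components recover essentially all of the variance.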
• #### Lost Relatives of the Gumbel Trick

The Gumbel trick is a method to sample from a discrete probability distribution, or to estimate its normalizing partition function. The method relies on repeatedly applying a random perturbation to the distribution in a ...
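The mechanism the abstract describes can be sketched in a few lines: perturbing the unnormalised log-potentials with i.i.d. Gumbel noise and taking the argmax yields exact samples, while the mean of the maximum estimates the log partition function. This is the standard Gumbel-max statement, not the paper's generalizations; the toy potentials are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

phi = np.array([1.0, 2.0, 3.0])        # hypothetical unnormalised log-potentials
gamma = 0.5772156649015329             # Euler-Mascheroni constant

n = 20000
g = rng.gumbel(size=(n, phi.size))     # i.i.d. Gumbel(0, 1) perturbations
perturbed = phi + g
samples = perturbed.argmax(axis=1)     # each argmax is an exact softmax(phi) sample
logZ_est = perturbed.max(axis=1).mean() - gamma  # max is Gumbel(log Z, 1)

logZ_true = np.log(np.exp(phi).sum())
probs = np.exp(phi - logZ_true)        # softmax(phi), for comparison
```

Averaging over many perturbations drives the estimator close to the true log partition function, and the empirical argmax frequencies match softmax(phi).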
• #### MCMC for Variationally Sparse Gaussian Processes

(Neural Information Processing Systems Foundation, 2015-12-07)
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is ...
• #### The Mondrian Kernel

(Association for Uncertainty in Artificial Intelligence Press, 2016-06-29)
We introduce the Mondrian kernel, a fast $\textit{random feature}$ approximation to the Laplace kernel. It is suitable for both batch and online learning, and admits a fast kernel-width-selection procedure as the random ...
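For context, a standard random Fourier feature approximation to the same Laplace kernel can be sketched as follows. This is not the Mondrian construction itself, just the classical alternative: the Laplace kernel's spectral density is Cauchy, so Cauchy-sampled frequencies give an unbiased feature map (dimensions and bandwidth below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Approximate k(x, y) = exp(-||x - y||_1 / sigma) with random Fourier features.
d, D, sigma = 3, 5000, 1.0
W = rng.standard_cauchy(size=(D, d)) / sigma   # Cauchy spectral samples
b = rng.uniform(0, 2 * np.pi, size=D)          # random phases

def features(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = np.array([0.2, -0.1, 0.4])
y = np.array([0.5, 0.3, -0.2])
approx = features(x) @ features(y)             # inner product of feature maps
exact = np.exp(-np.abs(x - y).sum() / sigma)   # exact Laplace kernel value
```

With D = 5000 features the inner product concentrates tightly around the exact kernel value.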
• #### Neural Adaptive Sequential Monte Carlo

(Curran Associates, 2015)
Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance ...
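A minimal bootstrap particle filter illustrates the class of methods the abstract refers to: sampling from an intractable posterior through repeated propagation, importance reweighting, and resampling. The linear-Gaussian model and all parameters below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian state-space model
a, q, r = 0.9, 0.5, 0.5          # transition coef, process std, observation std
T, N = 50, 2000                  # time steps, particles

# Simulate a latent trajectory and noisy observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + q * rng.normal()
y = x + r * rng.normal(size=T)

# Bootstrap particle filter: propose from the transition, weight by likelihood
particles = rng.normal(size=N)
est = np.zeros(T)
for t in range(T):
    particles = a * particles + q * rng.normal(size=N)  # propagate
    logw = -0.5 * ((y[t] - particles) / r) ** 2         # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = w @ particles                              # filtered posterior mean
    idx = rng.choice(N, size=N, p=w)                    # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(((est - x) ** 2).mean())
```

The filtered means track the latent state noticeably better than the raw observations, since each step fuses the prior dynamics with the likelihood.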
• #### Particle Gibbs for Infinite Hidden Markov Models

(Curran Associates, 2015-12-18)
Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model that can automatically infer the number of hidden states in the system. However, due to the ...
• #### Practical Probabilistic Programming with Monads

(ACM, 2015-07-30)
The machine learning community has recently shown a lot of interest in practical probabilistic programming systems that target the problem of Bayesian inference. Such systems come in different forms, but they all express ...
• #### Predictive Entropy Search for Bayesian Optimization with Unknown Constraints

(2015-06-01)
• #### Probabilistic machine learning and artificial intelligence

(NPG, 2015-05-27)
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing ...
• #### R/BHC: fast Bayesian hierarchical clustering for microarray data

(2009-08-06)
Background: Although the use of clustering methods has rapidly become one of the standard computational approaches in the literature of microarray gene expression data analysis, little attention has been paid to ...

• #### Scalable Variational Gaussian Process Classification

(JMLR, 2015-02-21)
Gaussian process classification is a popular method with a number of appealing properties. We show how to scale the model within a variational inducing point framework, outperforming the state of the art on benchmark ...