Now showing items 13-29 of 29

    • Latent Gaussian Processes for Distribution Estimation of Multivariate Categorical Data 

      Gal, Yarin; Chen, Yutian; Ghahramani, Zoubin (Microtome Publishing, 2015)
      Multivariate categorical data occur in many applications of machine learning. One of the main difficulties with these vectors of categorical variables is sparsity. The number of possible observations grows exponentially ...
    • Linear Dimensionality Reduction: Survey, Insights, and Generalizations 

      Cunningham, John P; Ghahramani, Zoubin (MIT Press, 2015-12-01)
      Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data ...
    • Lost Relatives of the Gumbel Trick 

      Balog, Matej; Tripuraneni, N; Ghahramani, Zoubin; Weller, Adrian Vivian
      The Gumbel trick is a method to sample from a discrete probability distribution, or to estimate its normalizing partition function. The method relies on repeatedly applying a random perturbation to the distribution in a ...
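The basic Gumbel-max trick the abstract refers to can be sketched as follows; this is a generic illustration of the standard trick, not code from the paper:

```python
import math
import random
from collections import Counter

def gumbel_max_sample(logits):
    """Draw one exact sample from softmax(logits) via the Gumbel-max trick:
    argmax_i (logits[i] + G_i), where G_i = -log(-log(U_i)) with U_i ~ Uniform(0, 1)."""
    return max(range(len(logits)),
               key=lambda i: logits[i] - math.log(-math.log(random.random())))

# Unnormalized log-probabilities for a 3-category distribution (0.1, 0.6, 0.3).
logits = [math.log(0.1), math.log(0.6), math.log(0.3)]
counts = Counter(gumbel_max_sample(logits) for _ in range(20000))
```

Over many draws the empirical frequencies match the target categorical distribution; the paper's contribution concerns the family of related perturbation tricks and partition-function estimators built on this idea.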
    • MCMC for Variationally Sparse Gaussian Processes 

      Hensman, James; Matthews, Alexander; Filippone, Maurizio; Ghahramani, Zoubin (Neural Information Processing Systems Foundation, 2015-12-07)
      Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is ...
    • The Mondrian Kernel 

      Balog, Matej; Lakshminarayanan, B; Ghahramani, Zoubin; Roy, DM; Teh, YW (Association for Uncertainty in Artificial Intelligence Press, 2016-06-29)
      We introduce the Mondrian kernel, a fast "random feature" approximation to the Laplace kernel. It is suitable for both batch and online learning, and admits a fast kernel-width-selection procedure as the random ...
    • Neural Adaptive Sequential Monte Carlo 

      Gu, Shixiang; Ghahramani, Zoubin; Turner, Richard E. (Curran Associates, 2015)
      Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance ...
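For orientation, a minimal bootstrap particle filter on a toy 1-D Gaussian random-walk model illustrates the SMC setting the abstract describes; this is a generic sketch, not the adaptive-proposal method of the paper:

```python
import math
import random

def particle_filter(observations, n_particles=1000, trans_std=1.0, obs_std=1.0):
    """Bootstrap particle filter for the toy state-space model
        x_t = x_{t-1} + N(0, trans_std^2),  y_t = x_t + N(0, obs_std^2).
    Returns the filtering mean E[x_t | y_{1:t}] at each step."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # Propagate each particle through the transition (proposal = prior).
        particles = [x + random.gauss(0.0, trans_std) for x in particles]
        # Weight particles by the observation likelihood N(y; x, obs_std^2).
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        means.append(sum(w * x for w, x in zip(weights, particles)) / total)
        # Multinomial resampling to counter weight degeneracy.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return means

est = particle_filter([0.5, 1.0, 1.5, 2.0])
```

The estimates track the observations while shrinking toward the prediction; the quality of such filters depends heavily on the proposal, which is what the paper proposes to learn with neural networks.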
    • On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes 

      Matthews, Alexander; Hensman, James; Turner, Richard Eric; Ghahramani, Zoubin
    • One-Shot Learning in Discriminative Neural Networks 

      Burgess, Jordan; Lloyd, James Robert; Ghahramani, Zoubin
    • Particle Gibbs for Infinite Hidden Markov Models 

      Tripuraneni, Nilesh; Gu, Shixiang; Ge, Hong; Ghahramani, Zoubin (Curran Associates, 2015-12-18)
      Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model that can automatically infer the number of hidden states in the system. However, due to the ...
    • Practical Probabilistic Programming with Monads 

      Scibior, Adam; Ghahramani, Zoubin; Gordon, Andrew D (ACM, 2015-07-30)
      The machine learning community has recently shown a lot of interest in practical probabilistic programming systems that target the problem of Bayesian inference. Such systems come in different forms, but they all express ...
    • Predictive Entropy Search for Bayesian Optimization with Unknown Constraints 

      Hernández-Lobato, José Miguel; Gelbart, Michael A; Hoffman, Matthew W; Adams, Ryan P; Ghahramani, Zoubin (JMLR, 2015-06-01)
      Unknown constraints arise in many types of expensive black-box optimization problems. Several methods have been proposed recently for performing Bayesian optimization with constraints, based on the expected improvement ...
    • Probabilistic machine learning and artificial intelligence 

      Ghahramani, Zoubin (NPG, 2015-05-27)
      How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing ...
    • Q-Prop: Sample-Efficient Policy Gradient with An Off-Policy Critic 

      Gu, Shixiang; Lillicrap, Timothy; Ghahramani, Zoubin; Turner, Richard Eric; Levine, Sergey
    • R/BHC: fast Bayesian hierarchical clustering for microarray data 

      Savage, R; Heller, Katherine Ann; Xu, Y; Ghahramani, Zoubin; Truman, W; Grant, M; Denby, K et al. (2009-08-06)
      Background: Although the use of clustering methods has rapidly become one of the standard computational approaches in the literature of microarray gene expression data analysis, little attention has been paid to ...
    • Scalable Discrete Sampling as a Multi-Armed Bandit Problem 

      Chen, Yutian; Ghahramani, Zoubin (2016)
    • Scalable Variational Gaussian Process Classification 

      Hensman, James; Matthews, Alexander; Ghahramani, Zoubin (JMLR, 2015-02-21)
      Gaussian process classification is a popular method with a number of appealing properties. We show how to scale the model within a variational inducing point framework, outperforming the state of the art on benchmark ...