The Matrix Generalized Inverse Gaussian Distribution: Properties and Applications
  • Journal: Lecture Notes in Computer Science
  • Year: 2016
  • Volume: 9851
  • Issue: 1
  • Pages: 648–664
  • Full-text size: 646 KB
  • References:
    1. Anderson, B., Moore, J.: Optimal Control: Linear Quadratic Methods (2007)
    2. Bai, Z., Demmel, J.: Using the matrix sign function to compute invariant subspaces. SIAM J. Matrix Anal. Appl. 19(1), 205–225 (1998)
    3. Barndorff-Nielsen, O., Blæsild, P., et al.: Exponential transformation models. Proc. Roy. Soc. London Ser. A 379(1776), 41–65 (1982)
    4. Bishop, C.M.: Bayesian PCA. NIPS 11, 382–388 (1999)
    5. Bishop, C.M.: Variational principal components. In: ICANN (1999)
    6. Blei, D., Cook, P., Hoffman, M.: Bayesian nonparametric matrix factorization for recorded music. In: ICML, pp. 439–446 (2010)
    7. Boyd, S., Barratt, C.: Linear Controller Design: Limits of Performance (1991)
    8. Brevern, A.D., Hazout, S., Malpertuy, A.: Influence of microarrays experiments missing values on the stability of gene groups by hierarchical clustering. BMC Bioinform. 5(1), 114 (2004)
    9. Bunse-Gerstner, A., Mehrmann, V.: A symplectic QR like algorithm for the solution of the real algebraic Riccati equation. IEEE Trans. Autom. Control 31, 1104–1113 (1986)
    10. Butler, R.W.: Generalized inverse Gaussian distributions and their Wishart connections. Scand. J. Statist. 25(1), 69–75 (1998)
    11. Byers, R.: Solving the algebraic Riccati equation with the matrix sign function. Linear Algebra Appl. 85, 267–279 (1987)
    12. Eberlein, E., Keller, U.: Hyperbolic distributions in finance. Bernoulli 1, 281–299 (1995)
    13. Herz, C.: Bessel functions of matrix argument. Ann. Math. 23, 77–87 (1955)
    14. Jørgensen, B.: Statistical Properties of the Generalized Inverse Gaussian Distribution
    15. Kong, A., Liu, J., Wong, W.: Sequential imputations and Bayesian missing data problems. JASA 89(425), 278–288 (1994)
    16. Laub, A.: A Schur method for solving algebraic Riccati equations. IEEE Trans. Autom. Control 24(6), 913–921 (1979)
    17. Lawrence, N.: Probabilistic non-linear principal component analysis with Gaussian process latent variable models. JMLR 6, 1783–1816 (2005)
    18. Lawrence, N., Urtasun, R.: Non-linear matrix factorization with Gaussian processes. In: ICML (2009)
    19. Li, T., Chu, E., et al.: Solving large-scale continuous-time algebraic Riccati equations by doubling. J. Comput. Appl. Math. 237(1), 373–383 (2013)
    20. Li, Y., Yang, M., Qi, Z., Zhang, Z.: Bayesian multi-task relationship learning with link structure. In: ICDM (2013)
    21. MacKay, D.: Information Theory, Inference, and Learning Algorithms (2003)
    22. Minka, T.P.: Automatic choice of dimensionality for PCA. In: NIPS (2000)
    23. Owen, A.: Monte Carlo Theory, Methods and Examples (2013)
    24. Salakhutdinov, R., Mnih, A.: Probabilistic matrix factorization. In: NIPS (2007)
    25. Salakhutdinov, R., Mnih, A.: Bayesian probabilistic matrix factorization using Markov chain Monte Carlo. In: ICML (2008)
    26. Seshadri, V.: Some properties of the matrix generalized inverse Gaussian distribution. Stat. Methods Pract. 69, 47–56 (2003)
    27. Seshadri, V., Wesołowski, J.: More on connections between Wishart and matrix GIG distributions. Metrika 68(2), 219–232 (2008)
    28. Smith, W., Hocking, R.: Algorithm AS 53: Wishart variate generator. Appl. Statist. 21, 341–345 (1972)
    29. Tipping, M.E., Bishop, C.M.: Probabilistic principal component analysis. J. Roy. Statist. Soc. Ser. B 61(3), 611–622 (1999)
    30. Wang, H., Fazayeli, F., et al.: Gaussian copula precision estimation with missing values. In: AISTATS (2014)
    31. Wishart, J.: The generalised product moment distribution in samples from a normal multivariate population. Biometrika 20A, 32–52 (1928)
    32. Yang, M., Li, Y., Zhang, Z.: Multi-task learning with Gaussian matrix generalized inverse Gaussian model. In: ICML (2013)
    33. Yoshii, K., Tomioka, R.: Infinite positive semidefinite tensor factorization for source separation of mixture signals. In: ICML (2013)
  • Authors and affiliations: Farideh Fazayeli (17)
    Arindam Banerjee (17)

    17. Department of Computer Science and Engineering, University of Minnesota, Twin Cities, USA
  • Book series: Machine Learning and Knowledge Discovery in Databases
  • ISBN:978-3-319-46128-1
  • Category: Computer Science
  • Subjects: Artificial Intelligence and Robotics
    Computer Communication Networks
    Software Engineering
    Data Encryption
    Database Management
    Computation by Abstract Devices
    Algorithm Analysis and Problem Complexity
  • Publisher: Springer Berlin / Heidelberg
  • ISSN:1611-3349
  • Volume (sort order): 9851
Abstract
While the Matrix Generalized Inverse Gaussian (\(\mathcal {MGIG}\)) distribution arises naturally in some settings as a distribution over symmetric positive semi-definite matrices, certain key properties of the distribution and effective ways of sampling from it have not been carefully studied. In this paper, we show that the \(\mathcal {MGIG}\) is unimodal, and that the mode can be obtained by solving an Algebraic Riccati Equation (ARE) [7]. Based on this property, we propose an importance sampling method for the \(\mathcal {MGIG}\) in which the mode of the proposal distribution matches that of the target. The proposed sampling method is more efficient than existing approaches [32, 33], which use proposal distributions whose mode may be far from the \(\mathcal {MGIG}\)’s mode. Further, we illustrate that the posterior distribution in latent factor models, such as probabilistic matrix factorization (PMF) [24], has the \(\mathcal {MGIG}\) distribution when marginalized over one latent factor. This characterization leads to a novel Collapsed Monte Carlo (CMC) inference algorithm for such latent factor models. We show that CMC achieves lower log loss or perplexity than MCMC, and needs fewer samples.
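
To make the mode-finding step concrete, the sketch below recasts the stationarity condition of an \(\mathcal {MGIG}\) density as a continuous-time algebraic Riccati equation and hands it to SciPy's general-purpose CARE solver. This is an illustrative sketch only, not the authors' implementation: it assumes the parameterization \(p(X) \propto |X|^{\nu-(p+1)/2}\,\mathrm{etr}\!\big(-\tfrac{1}{2}(\Psi X^{-1} + \Phi X)\big)\) (conventions for the \(\mathcal {MGIG}\) vary), and the helper name mgig_mode is hypothetical.

```python
# Minimal sketch (not the paper's code): find the MGIG mode via an ARE solver.
# Assumed density:  p(X) ∝ |X|^{nu-(p+1)/2} etr(-(Psi X^{-1} + Phi X)/2),  X > 0.
# Setting the gradient of log p(X) to zero gives the stationarity condition
#     X Phi X - (2*nu - (p+1)) X - Psi = 0,
# a symmetric quadratic matrix equation of algebraic-Riccati type.
import numpy as np
from scipy.linalg import cholesky, solve_continuous_are

def mgig_mode(nu, Psi, Phi):
    """Mode of MGIG(nu, Psi, Phi) under the parameterization assumed above."""
    p = Psi.shape[0]
    c = 2.0 * nu - (p + 1)
    A = (c / 2.0) * np.eye(p)         # linear term of the Riccati equation
    B = cholesky(Phi, lower=True)     # B @ B.T == Phi, quadratic term
    # solve_continuous_are solves  A.T X + X A - X B R^{-1} B.T X + Q = 0;
    # with Q = Psi and R = I this rearranges to the stationarity condition.
    return solve_continuous_are(A, B, Psi, np.eye(p))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, nu = 4, 6.0
    S = rng.standard_normal((p, p)); Psi = S @ S.T + p * np.eye(p)
    S = rng.standard_normal((p, p)); Phi = S @ S.T + p * np.eye(p)
    X = mgig_mode(nu, Psi, Phi)
    resid = X @ Phi @ X - (2 * nu - (p + 1)) * X - Psi
    print("positive definite:", bool(np.all(np.linalg.eigvalsh(X) > 0)))
    print("max |residual|:", float(np.abs(resid).max()))
```

In an importance sampler of the kind the abstract describes, a tractable proposal over positive definite matrices (for example a Wishart) would then be chosen so that its mode coincides with this X, and draws would be reweighted by the ratio of the unnormalized target density to the proposal density; the choice of proposal family here is an assumption for illustration, not taken from the paper.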
