A Blog Entry on Bayesian Computation by an Applied Mathematician
1 Introduction
| | Kernel | NN |
|---|---|---|
| Latent space | Infinite | Finite |
| Basis functions | Fixed | Adaptive |
The strength of kernel methods lies in nonparametric modeling.
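As a small illustration of the table above (my own sketch, not taken from the cited papers): kernel ridge regression with an RBF kernel implicitly works in an infinite-dimensional feature space, yet the fitted function is a sum of basis functions $k(\cdot, x_i)$ fixed by the kernel and the training inputs. The kernel choice, lengthscale, and regularization parameter below are placeholder assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Gram matrix k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    sq_dists = (np.sum(X1**2, 1)[:, None]
                + np.sum(X2**2, 1)[None, :]
                - 2 * X1 @ X2.T)
    return np.exp(-sq_dists / (2 * lengthscale**2))

def fit_krr(X, y, lam=1e-2, lengthscale=1.0):
    """Solve (K + lam * n * I) alpha = y for the dual coefficients."""
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict_krr(X_train, alpha, X_test, lengthscale=1.0):
    """f(x) = sum_i alpha_i k(x, x_i): a fixed-basis, nonparametric estimate."""
    return rbf_kernel(X_test, X_train, lengthscale) @ alpha

# Toy usage: regression on noisy sine data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = fit_krr(X, y)
X_new = np.linspace(-3, 3, 5)[:, None]
print(predict_krr(X, alpha, X_new))
```

The point of the sketch: the basis functions never adapt to the data (only their coefficients do), in contrast to a neural network, whose finite set of hidden units is itself learned.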
References
Li, Z., Meunier, D., Mollenhauer, M., and Gretton, A. (2022). Optimal rates for regularized conditional mean embedding learning. In S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, and A. Oh, editors, Advances in Neural Information Processing Systems, Vol. 35, pages 4433–4445. Curran Associates, Inc.
Park, J., and Muandet, K. (2020). A measure-theoretic approach to kernel conditional mean embeddings. In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H. Lin, editors, Advances in Neural Information Processing Systems, Vol. 33, pages 21247–21259. Curran Associates, Inc.
Song, L., Fukumizu, K., and Gretton, A. (2013). Kernel embeddings of conditional distributions: A unified kernel framework for nonparametric inference in graphical models. IEEE Signal Processing Magazine, 30(4), 98–111.
Song, L., Huang, J., Smola, A., and Fukumizu, K. (2009). Hilbert space embeddings of conditional distributions with applications to dynamical systems. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 961–968. New York, NY, USA: Association for Computing Machinery.
Footnotes
(Li et al., 2022, p. 5), Def. 2; (Park and Muandet, 2020, p. 4), Def. 3.1