<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Books (Contributed Chapters) on 池田 思朗</title><link>https://ikeda46.github.io/ja/tags/%E6%9B%B8%E7%B1%8D%E5%88%86%E6%8B%85%E5%9F%B7%E7%AD%86/</link><description>Recent content in Books (Contributed Chapters) on 池田 思朗</description><generator>Hugo</generator><language>ja</language><lastBuildDate>Wed, 01 Feb 2017 00:00:00 +0000</lastBuildDate><atom:link href="https://ikeda46.github.io/ja/tags/%E6%9B%B8%E7%B1%8D%E5%88%86%E6%8B%85%E5%9F%B7%E7%AD%86/index.xml" rel="self" type="application/rss+xml"/><item><title>スパース性を用いた推定</title><link>https://ikeda46.github.io/ja/posts/2017.02.ikeda.iwanamids/</link><pubDate>Wed, 01 Feb 2017 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2017.02.ikeda.iwanamids/</guid><description>&lt;p>In 『「特集」スパースモデリングと多変量データ解析』, Ed. by 岩波データサイエンス刊行委員会, 岩波データサイエンス Vol. 5, pp. 19–38.&lt;/p>
&lt;p>ISBN: 978-4-00-029855-1&lt;/p>
&lt;p>岩波書店.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>池田 思朗&lt;/li>
&lt;/ul>
&lt;h3 id="キーワード">キーワード:&lt;/h3>
&lt;ul>
&lt;li>Sparse estimation&lt;/li>
&lt;/ul></description></item><item><title>ターボ復号法の情報幾何的理解と改善の可能性</title><link>https://ikeda46.github.io/ja/posts/2006.09.ikeda.smapipbook/</link><pubDate>Fri, 01 Sep 2006 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2006.09.ikeda.smapipbook/</guid><description>&lt;p>In 「確率的情報処理と統計力学」, Ed. by 田中和之, SGC ライブラリ 50, pp. 52–58.&lt;/p>
&lt;p>サイエンス社.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>池田 思朗&lt;/li>
&lt;/ul>
&lt;h3 id="キーワード">キーワード:&lt;/h3>
&lt;ul>
&lt;li>Belief propagation&lt;/li>
&lt;li>Information geometry&lt;/li>
&lt;li>Turbo codes&lt;/li>
&lt;/ul></description></item><item><title>EM Algorithm in Neural Network Learning</title><link>https://ikeda46.github.io/ja/posts/2003.10.murataikeda.embook/</link><pubDate>Wed, 01 Oct 2003 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2003.10.murataikeda.embook/</guid><description>&lt;p>In &lt;em>The EM Algorithm and Related Statistical Models&lt;/em> (STATISTICS: A Dekker series of Textbooks and Monographs 170.), Ed. by Michiko Watanabe and Kazunori Yamaguchi, Chap. 8, pp. 95–126.&lt;/p>
&lt;p>ISBN: 0824747011&lt;/p>
&lt;p>New York, NY/Basel: Marcel Dekker, Inc.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>Noboru Murata&lt;/li>
&lt;li>Shiro Ikeda&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="abstract">Abstract:&lt;/h3>
&lt;p>(From the &ldquo;Introduction&rdquo;) In this article, we first review the EM algorithm from the geometrical viewpoint, based on the em algorithm proposed in [7]. This geometrical concept is important for interpreting various learning rules in neural networks. We then give some examples of neural network models in which the EM algorithm implicitly appears in the learning process. From the biological point of view, whether the EM algorithm can be appropriately implemented in real biological systems is an important problem. This is usually hard, and using a particular model, the Helmholtz machine, we briefly discuss the tradeoff between statistical models and biological models. Finally, we show two models of neural networks in which the EM algorithm is explicitly adopted for learning. These models are proposed mainly for practical applications rather than for biological modeling, and they are applied to complicated tasks such as controlling robots.&lt;/p></description></item><item><title>生体信号処理とノイズ</title><link>https://ikeda46.github.io/ja/posts/2002.11.ikeda.icabook/</link><pubDate>Fri, 01 Nov 2002 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2002.11.ikeda.icabook/</guid><description>&lt;p>In 「独立成分分析」, Ed. by 甘利俊一 and 村田昇, SGC ライブラリ 18, Chap. 10, pp. 71–78.&lt;/p>
&lt;p>サイエンス社.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>池田 思朗&lt;/li>
&lt;/ul>
&lt;h3 id="キーワード">キーワード:&lt;/h3>
&lt;ul>
&lt;li>Independent component analysis&lt;/li>
&lt;/ul></description></item><item><title>隠れ状態最尤推定と反復推定法 –EMアルゴリズムと Wake-Sleep</title><link>https://ikeda46.github.io/ja/posts/2002.03.ikeda.asakura/</link><pubDate>Fri, 01 Mar 2002 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2002.03.ikeda.asakura/</guid><description>&lt;p>In 「脳の情報表現–ニューロン・ネットワーク・数理モデル」, Ed. by 銅谷賢治, 伊藤浩之, 藤井宏, and 塚田稔, Chap. 9, pp. 98–106.&lt;/p>
&lt;p>朝倉書店.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>池田 思朗&lt;/li>
&lt;/ul>
&lt;h3 id="キーワード">キーワード:&lt;/h3>
&lt;ul>
&lt;li>EM algorithm&lt;/li>
&lt;li>Wake-Sleep algorithm&lt;/li>
&lt;/ul></description></item><item><title>Information Geometry and Mean Field Approximation: The $\alpha$-projection Approach</title><link>https://ikeda46.github.io/ja/posts/2001.02.amari_etal.mit/</link><pubDate>Thu, 01 Feb 2001 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2001.02.amari_etal.mit/</guid><description>&lt;p>In &lt;em>Advanced Mean Field Methods – Theory and Practice&lt;/em>, Ed. by Manfred Opper and David Saad, Chap. 16, pp. 241–257.&lt;/p>
&lt;p>ISBN: 0262150549.&lt;/p>
&lt;p>Cambridge, MA: MIT Press.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>Shun-ichi Amari&lt;/li>
&lt;li>Shiro Ikeda&lt;/li>
&lt;li>Hidetoshi Shimokawa&lt;/li>
&lt;/ul>
&lt;hr>
&lt;h3 id="abstract">Abstract:&lt;/h3>
&lt;p>Information geometry is applied to mean field approximation for elucidating its properties in the spin glass model or the Boltzmann machine. The $\alpha$-divergence is used for approximation, where $\alpha$-geodesic projection plays an important role. The naive mean field approximation and TAP approximation are studied from the point of view of information geometry, which treats the intrinsic geometric structures of a family of probability distributions. The bifurcation of the $\alpha$-projection is studied, at which the uniqueness of the $\alpha$-approximation is broken.&lt;/p></description></item><item><title>ICA on Noisy Data: A Factor Analysis Approach</title><link>https://ikeda46.github.io/ja/posts/2000.06.girolami.springer/</link><pubDate>Thu, 01 Jun 2000 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2000.06.girolami.springer/</guid><description>&lt;p>In &lt;em>Advances in Independent Component Analysis&lt;/em>, Ed. by Mark Girolami, Chap. 11, pp. 201–215.&lt;/p>
&lt;p>ISBN: 1852332638.&lt;/p>
&lt;p>Springer-Verlag London Ltd.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>Shiro Ikeda&lt;/li>
&lt;/ul></description></item><item><title>ニューラルネットとEM</title><link>https://ikeda46.github.io/ja/posts/2000.02.murataikeda.em/</link><pubDate>Tue, 01 Feb 2000 00:00:00 +0000</pubDate><guid>https://ikeda46.github.io/ja/posts/2000.02.murataikeda.em/</guid><description>&lt;p>In 「EMアルゴリズムと不完全データの諸問題」, Ed. by 渡辺美智子 and 山口和範, Chap. 8, pp. 155–188.&lt;/p>
&lt;p>多賀出版.&lt;/p>
&lt;h3 id="著者">著者:&lt;/h3>
&lt;ul>
&lt;li>村田 昇&lt;/li>
&lt;li>池田 思朗&lt;/li>
&lt;/ul>
&lt;h3 id="キーワード">キーワード:&lt;/h3>
&lt;ul>
&lt;li>EM algorithm&lt;/li>
&lt;/ul></description></item></channel></rss>