<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Machine_learning on Sawyer Zheng's Blog</title><link>https://elated-raman-42e0c2.netlify.app/tags/machine_learning/</link><description>Recent content in Machine_learning on Sawyer Zheng's Blog</description><generator>Hugo</generator><language>zh-cn</language><lastBuildDate>Mon, 24 Feb 2025 12:42:11 +0800</lastBuildDate><atom:link href="https://elated-raman-42e0c2.netlify.app/tags/machine_learning/index.xml" rel="self" type="application/rss+xml"/><item><title>Bayesian Optimization</title><link>https://elated-raman-42e0c2.netlify.app/post/notes/ai/bayesian_optimization/</link><pubDate>Tue, 19 Mar 2024 00:00:00 +0000</pubDate><guid>https://elated-raman-42e0c2.netlify.app/post/notes/ai/bayesian_optimization/</guid><description>&lt;div id="outline-container-headline-1" class="outline-2"&gt;
&lt;h2 id="headline-1"&gt;
References
&lt;/h2&gt;
&lt;div id="outline-text-headline-1" class="outline-text-2"&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://mp.weixin.qq.com/s?__biz=MzI2MDE1MDg4Mw==&amp;amp;mid=2656708131&amp;amp;idx=3&amp;amp;sn=f78fb3f91a6ae25b675ab72edbee1e75&amp;amp;chksm=f1c0852fc6b70c394f1d95e7716abf9be88b3596cb6bf2066220af63d3a67fbf283a0df225af&amp;amp;scene=27"&gt;数学之美——全概率公式 贝叶斯公式&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Bayes' theorem and conditional probability&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.jos.org.cn/jos/article/pdf/5607"&gt;贝叶斯优化方法和应用综述&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://distill.pub/2020/bayesian-optimization/"&gt;Exploring Bayesian Optimization&lt;/a&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Links to many English-language resources and implementation libraries&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
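&lt;p&gt;The Bayesian optimization loop these references discuss can be sketched as follows. This is a minimal illustration, not taken from any of the linked articles; the 1-D objective and every parameter choice are hypothetical.&lt;/p&gt;

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Hypothetical 1-D black-box function we want to minimize.
    return np.sin(3 * x) + 0.1 * x ** 2

def expected_improvement(cand, gp, y_best):
    # EI acquisition for minimization: expected amount by which a
    # candidate improves on the best value observed so far.
    mu, sigma = gp.predict(cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(4, 1))   # small random initial design
y = objective(X).ravel()

for _ in range(15):
    # Surrogate model of the objective from all evaluations so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    # Maximize the acquisition on a dense grid (fine in 1-D).
    cand = np.linspace(-2, 2, 500).reshape(-1, 1)
    x_next = cand[np.argmax(expected_improvement(cand, gp, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).item())

print("best x found:", X[np.argmin(y)].item())
```

&lt;p&gt;Each iteration refits a Gaussian-process surrogate to the points evaluated so far and picks the next point by maximizing expected improvement, which trades off exploration (high predictive uncertainty) against exploitation (low predicted value).&lt;/p&gt;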
&lt;/div&gt;
&lt;/div&gt;
&lt;div id="outline-container-headline-2" class="outline-2"&gt;
&lt;h2 id="headline-2"&gt;
What Hyperparameter Optimization Algorithms Are There
&lt;/h2&gt;
&lt;div id="outline-text-headline-2" class="outline-text-2"&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://www.cnblogs.com/suanfajin/p/18292979"&gt;算法金 | 最难的来了：超参数网格搜索、贝叶斯优化、遗传算法、模型特异化、Hyperopt、Optuna、多目标优化、异步并行优化 - 算法金「全网同名」…&lt;/a&gt;&lt;/p&gt;</description></item><item><title>Svm 支持向量机</title><link>https://elated-raman-42e0c2.netlify.app/post/notes/ai/svm_%E6%94%AF%E6%8C%81%E5%90%91%E9%87%8F%E6%9C%BA/</link><pubDate>Wed, 06 Mar 2024 00:00:00 +0000</pubDate><guid>https://elated-raman-42e0c2.netlify.app/post/notes/ai/svm_%E6%94%AF%E6%8C%81%E5%90%91%E9%87%8F%E6%9C%BA/</guid><description>&lt;div id="outline-container-headline-1" class="outline-2"&gt;
&lt;h2 id="headline-1"&gt;
References
&lt;/h2&gt;
&lt;div id="outline-text-headline-1" class="outline-text-2"&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://zhuanlan.zhihu.com/p/49331510"&gt;看了这篇文章你还不懂SVM你就来打我 - 知乎&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://blog.csdn.net/v_JULY_v/article/details/7624837"&gt;支持向量机通俗导论（理解SVM的三层境界）-CSDN博客&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div id="outline-container-headline-2" class="outline-2"&gt;
&lt;h2 id="headline-2"&gt;
Concepts
&lt;/h2&gt;
&lt;div id="outline-text-headline-2" class="outline-text-2"&gt;
&lt;ol&gt;
&lt;li&gt;Hard margin&lt;/li&gt;
&lt;li&gt;Soft margin&lt;/li&gt;
&lt;li&gt;What a support vector is&lt;/li&gt;
&lt;li&gt;Differences between classification and regression&lt;/li&gt;
&lt;/ol&gt;
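&lt;p&gt;A toy scikit-learn sketch of these four concepts; the data and parameter values are made up for illustration. A very large C approximates a hard margin, a small C gives a soft margin, only the support vectors determine the fitted boundary, and SVR applies the same machinery to regression via an epsilon-insensitive tube.&lt;/p&gt;

```python
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)
# Two linearly separable 2-D blobs for classification.
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Very large C ~ hard margin (no slack); small C ~ soft margin (slack allowed).
hard = SVC(kernel="linear", C=1e6).fit(X, y)
soft = SVC(kernel="linear", C=0.01).fit(X, y)

# The boundary depends only on the support vectors; a softer margin
# typically pulls more points into that role.
print("support vectors:", len(hard.support_), "vs", len(soft.support_))

# SVR: same idea for regression, ignoring errors inside an epsilon tube.
Xr = np.linspace(0, 5, 50).reshape(-1, 1)
reg = SVR(kernel="rbf", epsilon=0.1).fit(Xr, np.sin(Xr).ravel())
print("SVR prediction at x=1:", reg.predict([[1.0]])[0])
```

&lt;p&gt;Comparing the two classifiers' support-vector counts makes the hard/soft distinction concrete: loosening the margin enlarges the set of points that constrain the solution.&lt;/p&gt;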
&lt;/div&gt;
&lt;/div&gt;</description></item></channel></rss>