<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
	<title>TTyb's Blog</title>
	<description>In Pavlov's conditioning experiment: given a dog, feed it after ringing a bell each time; after enough repetitions, the dog will reflexively salivate on hearing the bell, forming an attachment to the sound. Extending this to real life: given a girl you like, present Pavlovian gifts or snacks at every meeting, thereby inducing her attachment. Now bring in Schrödinger's cat: before you confess, you and the girl exist in a "probability cloud"; once you confess, the cloud collapses into reality. In the Pavlovian pre-confession stage, your relationship with her is the paradox of "both lovers and not lovers". Returning to the Pavlovian experiment: once she has formed sufficient attachment, you can collapse the "probability cloud". At that moment this enigma of a man, channeling Schrödinger with a melancholy as deep as quantum theory, makes her fall for him beyond recall! Thus were created the Pavlovian pickup method and the Schrödinger pickup method, their author retiring into anonymity.</description>
	<link>https://ttyb.github.io/</link>
	<atom:link href="https://ttyb.github.io/feed.xml" rel="self" type="application/rss+xml"/>
	
	
	<item>
		<title>spark DenseVector to SparseVector</title>
		<description>After assembling features with `import org.apache.spark.ml.feature.VectorAssembler`, an error occurs when the result is passed to `import org.apache.spark.mllib.classification.SVMWithSGD` for training</description>
		<pubDate>Fri, 05 Jul 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/scala/spark-DenseVector-to-SparseVector.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/scala/spark-DenseVector-to-SparseVector.html</guid>

		
			<category>scala</category>
		
	</item>
	
	<item>
		<title>Packaging Python3 as an exe</title>
		<description>How to package a Python3 program as an exe</description>
		<pubDate>Mon, 13 May 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/python/Python3-%E6%89%93%E5%8C%85exe.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/python/Python3-%E6%89%93%E5%8C%85exe.html</guid>

		
			<category>python</category>
		
	</item>
	
	<item>
		<title>Ranking the Marvel movies by how good they are</title>
		<description>In the Marvel movie series, one week out from the release of "Avengers: Endgame", both "Iron Man" and "Avengers: Infinity War" hold a Douban score of 8.1. "Iron Man" has 353,695 ratings and "Avengers: Infinity War" has 557,491. Are these two movies equally good?</description>
		<pubDate>Mon, 29 Apr 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/ml/%E5%9F%BA%E4%BA%8E%E6%BC%AB%E5%A8%81%E7%B3%BB%E5%88%97%E7%94%B5%E5%BD%B1%E5%A5%BD%E7%9C%8B%E7%A8%8B%E5%BA%A6%E6%8E%92%E5%BA%8F.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/ml/%E5%9F%BA%E4%BA%8E%E6%BC%AB%E5%A8%81%E7%B3%BB%E5%88%97%E7%94%B5%E5%BD%B1%E5%A5%BD%E7%9C%8B%E7%A8%8B%E5%BA%A6%E6%8E%92%E5%BA%8F.html</guid>

		
			<category>ML</category>
		
	</item>
	
	<item>
		<title>Scraping and analyzing Taobao/Tmall product inventory</title>
		<description>Yesterday I received a scraping request from a follower of my WeChat official account. Target platform: Tmall or Taobao; target data: the price and stock count of each variant of a given product</description>
		<pubDate>Fri, 19 Apr 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/python/%E6%B7%98%E5%AE%9D%E5%A4%A9%E7%8C%AB%E5%95%86%E5%93%81%E5%BA%93%E5%AD%98%E6%8A%93%E5%8F%96%E5%88%86%E6%9E%90.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/python/%E6%B7%98%E5%AE%9D%E5%A4%A9%E7%8C%AB%E5%95%86%E5%93%81%E5%BA%93%E5%AD%98%E6%8A%93%E5%8F%96%E5%88%86%E6%9E%90.html</guid>

		
			<category>python</category>
		
	</item>
	
	<item>
		<title>Downloading Excel files from flask</title>
		<description>Downloading an Excel file from a flask page</description>
		<pubDate>Thu, 28 Mar 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/python/flask%E4%B8%8B%E8%BD%BDexcel.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/python/flask%E4%B8%8B%E8%BD%BDexcel.html</guid>

		
			<category>python</category>
		
	</item>
	
	<item>
		<title>DataFrameNaFunctions has no fill method</title>
		<description>java.lang.NoSuchMethodError: org.apache.spark.sql.DataFrameNaFunctions.fill(JLscala/collection/Seq;)Lorg/apache/spark/sql/Dataset</description>
		<pubDate>Wed, 06 Mar 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/scala/DataFrameNaFunctions%E6%97%A0fill%E6%96%B9%E6%B3%95.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/scala/DataFrameNaFunctions%E6%97%A0fill%E6%96%B9%E6%B3%95.html</guid>

		
			<category>scala</category>
		
	</item>
	
	<item>
		<title>Meaning of the TFIDF results</title>
		<description>The meaning of the fields in the TFIDF results produced by the import org.apache.spark.ml.feature.{HashingTF, IDF} library</description>
		<pubDate>Mon, 28 Jan 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/scala/TFIDF%E7%BB%93%E6%9E%9C%E5%90%AB%E4%B9%89.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/scala/TFIDF%E7%BB%93%E6%9E%9C%E5%90%AB%E4%B9%89.html</guid>

		
			<category>scala</category>
		
	</item>
	
	<item>
		<title>Reading and writing CSV with spark</title>
		<description>Reading a CSV into a DataFrame with spark, and writing a DataFrame out to CSV with spark</description>
		<pubDate>Tue, 15 Jan 2019 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/scala/spark%E8%AF%BB%E5%86%99csv.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/scala/spark%E8%AF%BB%E5%86%99csv.html</guid>

		
			<category>scala</category>
		
	</item>
	
	<item>
		<title>Concatenating pyspark column values into one row</title>
		<description>Using pyspark to concatenate a dataframe column's values into a single row, similar to SQL's GROUP_CONCAT function. The spark (Scala) and pyspark approaches are not interchangeable</description>
		<pubDate>Sat, 29 Dec 2018 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/python/pyspark%E5%88%97%E5%90%88%E5%B9%B6%E4%B8%BA%E4%B8%80%E8%A1%8C.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/python/pyspark%E5%88%97%E5%90%88%E5%B9%B6%E4%B8%BA%E4%B8%80%E8%A1%8C.html</guid>

		
			<category>python</category>
		
	</item>
	
	<item>
		<title>A crawler for Lianjia second-hand housing listings</title>
		<description>Wanting to see whether housing prices have reached a point where I could buy, I scraped Lianjia's second-hand and new housing listings, and found that in Guangzhou some renovated 88 m² three-bedroom, two-living-room apartments require a down payment of only 290,000 RMB, averaging 11,000 RMB/m². When money falls short, information makes up for it: you can always find a cheap apartment!</description>
		<pubDate>Fri, 21 Dec 2018 00:00:00 +0000</pubDate>
		
		<link>https://ttyb.github.io/python/%E9%93%BE%E5%AE%B6%E4%BA%8C%E6%89%8B%E6%88%BF%E6%A5%BC%E7%9B%98%E7%88%AC%E8%99%AB.html</link>
		<guid isPermaLink="true">https://ttyb.github.io/python/%E9%93%BE%E5%AE%B6%E4%BA%8C%E6%89%8B%E6%88%BF%E6%A5%BC%E7%9B%98%E7%88%AC%E8%99%AB.html</guid>

		
			<category>python</category>
		
	</item>
	
</channel>
</rss>