Automata is Hongyu Su’s blog on machine learning and data science.
More information about this blog, its kith (blogroll, bookmarks, etc.), and a complete archive of past posts are available via links at the top of the page.
A feed of the most recent posts is also available.
Deep Sentiment Prediction as Web Service
I have been thinking for a long while about building a web service for sentiment analysis, the idea being to tell how emotionally positive (or negative) a given piece of text is. Despite the potentially huge range of interesting applications and use cases, we will focus on sentiment analysis for tweets. This article describes what was built and how.
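As a rough illustration of what such a service could look like, here is a minimal sketch of a tweet-sentiment endpoint. The toy training data, the logistic-regression stand-in for a real (deep) model, and the `/sentiment` route are all my own illustrative assumptions, not the pipeline used in the post.

```python
# Minimal sentiment-prediction web service sketch (illustrative only).
from flask import Flask, request, jsonify
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: 1 = positive, 0 = negative (placeholder for real tweets).
tweets = ["I love this", "great day", "awesome work",
          "I hate this", "terrible day", "awful work"]
labels = [1, 1, 1, 0, 0, 0]

# Bag-of-words + logistic regression as a stand-in for a deep model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tweets, labels)

app = Flask(__name__)

@app.route("/sentiment", methods=["POST"])
def sentiment():
    text = request.json.get("text", "")
    # Probability of the positive class as the "positivity" score.
    prob = model.predict_proba([text])[0][1]
    return jsonify({"text": text, "positivity": float(prob)})

if __name__ == "__main__":
    app.run(port=5000)
```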
Spark on time series preference data
To put the problem in general terms, we have a user-item preference matrix that evolves over time. Essentially, we have a collection of user-item preference matrices, one for each time point. The preference matrix can be, for example, users' preferences over a collection of books, the popularity of movies among people, or the effectiveness of a set of keywords across a collection of campaigns. The prediction task is to forecast the user-item preference matrix at the next time point.
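To make the setup concrete, here is a minimal sketch of how such time-indexed preference data might be laid out in Spark, with a naive historical-average forecast as a baseline; the column names and the averaging step are my own assumptions, not the model developed in the post.

```python
# Time-stamped user-item preference data in Spark (illustrative sketch).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("preference-forecast").getOrCreate()

# One row per observed (time, user, item) preference cell.
rows = [
    (1, "alice", "book_a", 4.0), (1, "alice", "book_b", 2.0),
    (2, "alice", "book_a", 5.0), (2, "bob",   "book_b", 3.0),
    (3, "alice", "book_a", 4.5), (3, "bob",   "book_b", 3.5),
]
df = spark.createDataFrame(rows, ["time", "user", "item", "score"])

# Naive baseline: predict the matrix at the next time point by averaging
# each (user, item) cell over all earlier time points.
forecast = (df.groupBy("user", "item")
              .agg(F.avg("score").alias("predicted_score")))
forecast.show()
```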
GPU computation on Amazon EC2
Running a deep learning algorithm properly is not a big deal. We discuss the setup that allows us to run a deep learning algorithm, in particular neural style, on Amazon GPU instances.
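Before launching any deep learning job on a GPU instance, a quick sanity check that the driver and GPU are actually visible can save time; the snippet below is my own check, not part of the post's setup.

```python
# Verify that the NVIDIA driver and GPU are visible on the EC2 instance.
import subprocess

# nvidia-smi ships with the NVIDIA driver; a non-zero exit code usually
# means the driver is missing or no GPU is attached to the instance.
result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
if result.returncode == 0:
    print(result.stdout)
else:
    print("No usable GPU found:", result.stderr)
```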
Cool stuff at NIPS 2015 - Neural Style
NIPS is a top academic conference in theoretical machine learning and artificial intelligence. Its acceptance rate is slightly lower than that of other top machine learning conferences (ICML, AISTATS, ICCV, CVPR). NIPS favors strongly theoretical work, pioneering and forward-looking work, and milestone results. I have been invited by my supervisor to review NIPS submissions over the years, and the question I asked myself most often while reviewing was whether the paper at hand was truly original or merely built on previous work; papers leaning toward the latter were essentially doomed. Newcomers to machine learning and AI may wonder why NIPS papers are cited less often than papers from other machine learning conferences. The reason is actually quite simple: most people cannot fully understand NIPS papers, and it is hard to cite what you cannot understand. Open one up and it is page after page of mathematical formulas; I read them with all my sincerity, yet they remain a mystery to me. Still, mathematics has always been known for its conciseness, precision, and generality, and I think that is part of the charm of NIPS. As a practitioner (read: blue-collar worker) in machine learning and AI, last December I too flew to dark, cold Montreal to have my intelligence humbled over and over again. In the next few posts, I will gradually summarize (read: agonize over) the bits and pieces of NIPS 2015.
Cool stuff in NIPS 2015 (symposium) - Neural Style
The deep learning algorithm neural style is also known as neural art. Similar algorithmic techniques have appeared in the so-called deep dream. It is recent work in the field of deep learning, and of course it is super cool. The algorithm has been around for a few months already, and I have had my eye on it for a while. Let's take a close look at the technology behind the scenes.
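As a rough sketch of the idea (following the formulation in Gatys et al.'s neural style paper rather than anything specific to this post), the algorithm synthesizes an image $x$ by minimizing a weighted sum of a content loss against a photograph $p$ and a style loss against an artwork $a$:

$$\mathcal{L}_{\text{total}}(p, a, x) = \alpha\, \mathcal{L}_{\text{content}}(p, x) + \beta\, \mathcal{L}_{\text{style}}(a, x),
\qquad
\mathcal{L}_{\text{content}}(p, x) = \tfrac{1}{2} \sum_{i,j} \bigl(F^{l}_{ij} - P^{l}_{ij}\bigr)^{2},$$

where $F^{l}$ and $P^{l}$ are the CNN feature maps of $x$ and $p$ at layer $l$. The style loss compares Gram matrices $G^{l}_{ij} = \sum_{k} F^{l}_{ik} F^{l}_{jk}$ of the generated image with those of the artwork, $A^{l}$:

$$\mathcal{L}_{\text{style}}(a, x) = \sum_{l} \frac{w_{l}}{4 N_{l}^{2} M_{l}^{2}} \sum_{i,j} \bigl(G^{l}_{ij} - A^{l}_{ij}\bigr)^{2},$$

where $N_{l}$ is the number of feature maps at layer $l$ and $M_{l}$ their spatial size.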