Checking (and playing with) the Bitcoin price has become a worldwide, all-day activity for white-collar workers. As a data scientist, I think the latent Bitcoin process is white noise (returns) and a random walk (price); the true underlying function seems very complicated.
In this post, I will try to show some simple baseline cases for Bitcoin prediction. In the past…
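The white-noise / random-walk idea can be sketched with a toy simulation (my own illustration, not code from the post): i.i.d. Gaussian log returns, with price as their cumulative sum in log space.

```python
import numpy as np

rng = np.random.default_rng(42)

# White-noise log returns: i.i.d. Gaussian steps (parameters are made up)
returns = rng.normal(loc=0.0, scale=0.02, size=1000)

# Price as a random walk: exponentiated cumulative sum of the returns
price = 100 * np.exp(np.cumsum(returns))

print(price[:5])
```

Under this model the best next-step price prediction is essentially today's price, which is why naive baselines are hard to beat.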
In this post I will do Topic Modelling both with LDA (Latent Dirichlet Allocation, which was designed for this purpose) and with word embeddings. I will try to apply topic modeling to different combinations of algorithms (TF-IDF, LDA, and BERT) with different dimension reductions (PCA, t-SNE, UMAP).
In a former post I played with SVD for recommendation. Now I will play with the heartbreaking Surprise library. It makes everything a few lines of code, so soon our job could be an Excel plugin.
The code is on GitHub. ( Link )
In this post, I will be using Surprise and compare…
In this post I will try to visualize what SVD does for us. I will try to predict my personal likes with the algorithm, using the standard MovieLens dataset. This will be a series of posts on Recommender Systems with:
Surprise Recommender Library
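What SVD "does for us" can be shown in a few lines of numpy (a toy user-by-movie matrix of my own, not the MovieLens data): truncate the factorization to the top-k singular values and the reconstruction fills in a low-rank estimate of every rating.

```python
import numpy as np

# Toy user x movie ratings matrix (0 = unrated; hypothetical data)
R = np.array([
    [5., 4., 0., 1.],
    [4., 5., 1., 0.],
    [1., 0., 5., 4.],
    [0., 1., 4., 5.],
])

# Full SVD, then keep only the k largest singular values
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # rank-k approximation

print(np.round(R_hat, 1))
```

The rank-k reconstruction is the best approximation of `R` in the least-squares sense (Eckart–Young), which is what makes it usable as a predictor for the missing entries.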
In this post, I will try to show simple usage and training of GPT-2. I assume you have basic knowledge of GPT-2. GPT is an auto-regressive language model: it can generate text for us with its huge pretrained models. I want to fine-tune GPT-2 so that it generates better…
In this post, I will show simple XLNet usage and try to explain it with simple visualizations. There are already some pretty nice tutorials, so here I assume you have basic knowledge of XLNet. Like my other posts (attention, BERT), I try to show meaningful examples with the simplest dataset possible.