I came to The University of Queensland to work on Victor Chernozhukov's Double Machine Learning methodology (I will share this recent work and its details here). During the roughly two months I spent here, I gathered many impressions of Australia and, in particular, of the city of Brisbane. In this post I will share the notes I took on the points that caught my attention, moving from the general to the specific.
[Read More]
Book Note - The Ways of Paradox and Other Essays
Willard Van Orman Quine
The notes below were taken while reading the first chapter of Willard Van Orman Quine's book The Ways of Paradox and Other Essays.
[Read More]
Tuning ML Hyperparameters - LASSO and Ridge Examples
sklearn.model_selection.GridSearchCV
As far as I can see in articles and in Kaggle competitions, people do not bother to tune the hyperparameters of ML algorithms, except for neural networks. One tests several ML algorithms and picks the best using cross-validation or other methods.
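As a minimal sketch of what such tuning looks like, the snippet below searches the regularization strength of LASSO and Ridge with GridSearchCV. The synthetic data, the alpha grid, and the 5-fold setting are illustrative assumptions, not values from the post.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import GridSearchCV

# Toy regression data (assumption; replace with your own dataset).
X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)

# Grid of candidate regularization strengths, searched with 5-fold CV.
param_grid = {"alpha": np.logspace(-3, 2, 20)}

lasso_cv = GridSearchCV(Lasso(max_iter=10_000), param_grid, cv=5)
ridge_cv = GridSearchCV(Ridge(), param_grid, cv=5)

lasso_cv.fit(X, y)
ridge_cv.fit(X, y)

print("Best Lasso alpha:", lasso_cv.best_params_["alpha"])
print("Best Ridge alpha:", ridge_cv.best_params_["alpha"])
```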
[Read More]
Preprocessing for Neural Networks - Normalization Techniques
Scaling, standardization, and so on
I mentioned a critical preprocessing tool for Lasso in my last post. Today I will write about preprocessing for Neural Networks.
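As a minimal sketch of the two normalization techniques discussed, the snippet below applies standardization and min-max scaling with scikit-learn; the toy array and the choice of scalers are assumptions for illustration only.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Two features on very different scales (assumption; replace with real inputs).
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

# Standardization: zero mean and unit variance per feature.
X_std = StandardScaler().fit_transform(X)

# Min-max scaling: each feature mapped into the [0, 1] range.
X_minmax = MinMaxScaler().fit_transform(X)

print(X_std)
print(X_minmax)
```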
[Read More]
Preprocessing for LASSO
Polynomial Features
I have been working on the Double Machine Learning methodology at The University of Queensland for three weeks, so I am spending most of my time on ML estimations and trying to improve their performance. I tried many preprocessing tools for each ML algorithm separately and noticed that adding polynomial features yields a large performance increase for LASSO. In my case even a 1% performance increase matters, but adding polynomial features improves performance by 10-25%, depending on the estimation type. I believe adding polynomial features matters especially if the data contain many nonlinear covariates.
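The sketch below compares LASSO with and without polynomial features on a synthetic nonlinear dataset; the Friedman data, the degree-2 expansion, and the alpha value are illustrative assumptions, not the settings or results from the actual study.

```python
from sklearn.datasets import make_friedman1
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Nonlinear toy regression data (assumption; stands in for the real covariates).
X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)

# LASSO on raw features vs. LASSO on degree-2 polynomial features.
plain = make_pipeline(StandardScaler(),
                      Lasso(alpha=0.01, max_iter=10_000))
poly = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                     StandardScaler(),
                     Lasso(alpha=0.01, max_iter=10_000))

print("LASSO, raw features:       ", cross_val_score(plain, X, y, cv=5).mean())
print("LASSO, polynomial features:", cross_val_score(poly, X, y, cv=5).mean())
```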
[Read More]