
Kaggle is now over!
By Santiago • Issue #26
My Kaggle competition finished, and I ended in the Top 4% of participants.
That makes me happy!
Overall, I’m definitely looking forward to my next competition, and I recommend you look into Kaggle too.
It was really fun!

I didn't end in any of the first three places, but the image is pretty.
K-Fold Cross-Validation
Do you know how and when to use cross-validation?
Here is a quick introduction to it.
It'll take 2 minutes of your time, and I hope it makes the idea clearer than it has ever been.
Santiago
If your machine learning model sucks, I have an idea for you.

Thread: Quick introduction to k-fold cross-validation and how you can use it.
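
If you want to try it right away, here is a minimal sketch of k-fold cross-validation using scikit-learn. The iris dataset and logistic regression model are just placeholders; swap in your own data and model.

```python
# Minimal k-fold cross-validation sketch (placeholder data and model).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# Split the data into 5 folds. Each fold takes a turn as the
# validation set while the other 4 folds are used for training.
kfold = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(model, X, y, cv=kfold)

print(scores)          # one score per fold
print(scores.mean())   # the average is your performance estimate
```

The average across the folds is a much more stable estimate of your model's performance than a single train/test split.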
More than one model
Here is an interesting approach to solving a problem: you start with a dataset, slice it up, and build a separate model for each slice. (There's a sketch of the idea after the thread below.)
I’ve found that this is especially useful when fairness is a concern.
Here is how you can do it.
Santiago
What can you do when your machine learning model stops improving?

There's always a point where you hit the ceiling and the model's performance stalls.

Thread: A couple of tricks to improve your model.
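
Here is a minimal sketch of the one-model-per-slice idea, assuming a tabular dataset with a categorical column to slice on. The "region" column, features, and label below are made up for illustration.

```python
# One model per slice: train a separate model for each value of a
# categorical column, then route each prediction to the right model.
# All column names here are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "region": rng.choice(["north", "south"], size=200),  # slice column
    "x1": rng.normal(size=200),
    "x2": rng.normal(size=200),
})
df["label"] = (df["x1"] + df["x2"] > 0).astype(int)
features = ["x1", "x2"]

# Train one model per slice of the data.
models = {}
for region, group in df.groupby("region"):
    model = LogisticRegression()
    model.fit(group[features], group["label"])
    models[region] = model

# At prediction time, each row goes to the model trained on its slice.
def predict(row):
    return models[row["region"]].predict(row[features].to_frame().T)[0]

print(predict(df.iloc[0]))
```

Because each slice gets its own model, you can also measure performance per slice, which is exactly what you want when fairness is a concern.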
More about that Kaggle competition
In case you are looking for a little bit more information:
Santiago
I finished my first Kaggle competition and scored in the top 4% of participants.

I learned a few valuable lessons. Here they are: ↓