Entropy and Getting Hired

In information theory, there is a concept called entropy. For a good refresher, I highly recommend the article by Naoki.
Basically, low entropy means receiving very predictable information, while high entropy means receiving very unpredictable information (some also describe it as an element of surprise).
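To make the idea concrete, here is a minimal sketch of Shannon entropy computed over an empirical distribution. The project names are hypothetical, chosen only to illustrate the contrast between a predictable pile of resumes and a varied one:

```python
import math
from collections import Counter

def shannon_entropy(events):
    """Shannon entropy (in bits) of the empirical distribution of events."""
    counts = Counter(events)
    total = len(events)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Low entropy: 9 of 10 resumes list the same project -> little surprise.
predictable = ["titanic"] * 9 + ["heart-disease"]

# High entropy: every resume lists a different project -> more surprise.
varied = ["churn-model", "sensor-drift", "pet-project", "nlp-tagger", "pricing"]

print(shannon_entropy(predictable))  # ~0.47 bits
print(shannon_entropy(varied))       # ~2.32 bits (log2 of 5 equally likely items)
```

A uniform distribution over distinct items maximizes entropy, which is exactly the intuition behind the advice that follows: the more your resume differs from everyone else's, the more information it carries.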
I have been interviewing data scientists for the past 3–4 years, and I must say most data science aspirants’ resumes have low entropy.
Everyone has the same projects:
- Titanic survivors prediction
- Heart disease prediction
- Home loan prediction
- Credit default prediction
- Same Kaggle exercises
- Some object detection exercise
- Predicting the stock market via forecasting (LSTM)
If you want to improve your odds of getting hired for an internship or a full-time position, I highly recommend increasing the entropy of your resume.
I know your natural question would be, “How do I do that?”
Well for starters, don’t do what everyone does.
Instead of putting the run-of-the-mill examples cited above:
- Perhaps do your own pet project
- Collect your own data (it need not be huge)
- Study the data thoroughly (talk about the data-generating distribution)
- Present the insights (talk about the outliers and whether they really contain some signal)
- Apply the appropriate statistical technique or ML algorithm
Finally,
- Don’t just talk about the accuracy metrics.
- Talk about how your solution solved the problem and the tangible benefits it delivered.
And just like that you have increased the entropy of your resume.
Your resume now has an “element of surprise” for the recruiter, and you have improved your odds of getting hired!
For Data Science Consulting and Solutions, get in touch with us at:
Website: https://www.arymalabs.com/