My attempt at the famous Titanic challenge from Kaggle. As a first step I explored and analysed the given dataset (see the Feature Engineering notebook). Afterwards, I ran a principal component analysis (PCA) to extract useful features. As a last step I used a support vector machine with a radial basis function (RBF) kernel to predict who will die and who will survive on the Titanic. I scored 0.74641 on Kaggle, which is pretty decent for the time I've invested. The score could be improved with more advanced feature engineering, for example identifying families from the passengers' names. After my submission I looked at some of the solutions on the Kaggle site and found that a decision tree or a random forest might have led to better results.
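The sketch below shows roughly what this pipeline looks like, assuming scikit-learn and the standard Kaggle `train.csv`; the column handling is a simplified stand-in for the actual preprocessing in the Feature Engineering notebook, and the number of PCA components is illustrative.

```python
# Minimal sketch of the described pipeline: encode features, scale,
# reduce with PCA, classify with an RBF-kernel SVM. Assumes the standard
# Kaggle train.csv; the real preprocessing may differ in detail.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

train = pd.read_csv("train.csv")

# Simple numeric encoding of a few standard Titanic columns.
X = pd.DataFrame({
    "Pclass": train["Pclass"],
    "Sex": (train["Sex"] == "female").astype(int),
    "Age": train["Age"].fillna(train["Age"].median()),
    "Fare": train["Fare"].fillna(train["Fare"].median()),
    "SibSp": train["SibSp"],
    "Parch": train["Parch"],
})
y = train["Survived"]

# Scaling before PCA matters: both PCA and the RBF kernel are
# sensitive to feature magnitudes.
model = make_pipeline(StandardScaler(), PCA(n_components=4), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```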
Most of the kernels I've found on Kaggle score around 0.8, so with 0.746 I'm more than happy. Even the "Titanic Mega Model" scored only 0.842, and I'm sure it took a really long time to put together.