# ML_Kaggle_Titanic

A test of LightGBM and FastTree models, with GPU acceleration, in C#/.NET via ML.NET.
## Why
The main reason for this project is to compare building an ML model in C#/.NET against the equivalent workflow in Python.
## Overview
This project predicts whether passengers on the Titanic survived. The dataset comes from Kaggle and contains 891 rows and 12 columns.
The project compares LightGBM, FastTree, and a deep learning ensemble model consisting of:

- SDCA Logistic Regression
- FastForest
- LBFGS Logistic Regression
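As a rough sketch of what the ML.NET side of such a comparison looks like, the snippet below trains and evaluates a single LightGBM model. The `TitanicPassenger` schema class, column choices, and file name `train.csv` are illustrative assumptions (not taken from this repo's code); it assumes the `Microsoft.ML` and `Microsoft.ML.LightGbm` NuGet packages and the standard Kaggle Titanic CSV layout.

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical schema class; column indices follow the Kaggle train.csv layout.
public class TitanicPassenger
{
    [LoadColumn(1)] public bool Survived { get; set; }
    [LoadColumn(2)] public float Pclass { get; set; }
    [LoadColumn(4)] public string Sex { get; set; }
    [LoadColumn(5)] public float Age { get; set; }
    [LoadColumn(9)] public float Fare { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext(seed: 0);

        // Load the Kaggle training data (assumed to sit next to the executable).
        IDataView data = mlContext.Data.LoadFromTextFile<TitanicPassenger>(
            "train.csv", hasHeader: true, separatorChar: ',');
        var split = mlContext.Data.TrainTestSplit(data, testFraction: 0.2);

        // Encode categoricals, patch missing ages, assemble a feature vector,
        // then append the LightGBM binary-classification trainer.
        var pipeline = mlContext.Transforms.Categorical
                .OneHotEncoding("SexEncoded", "Sex")
            .Append(mlContext.Transforms.ReplaceMissingValues("Age"))
            .Append(mlContext.Transforms.Concatenate(
                "Features", "Pclass", "SexEncoded", "Age", "Fare"))
            .Append(mlContext.BinaryClassification.Trainers.LightGbm(
                labelColumnName: "Survived"));

        var model = pipeline.Fit(split.TrainSet);
        var metrics = mlContext.BinaryClassification.Evaluate(
            model.Transform(split.TestSet), labelColumnName: "Survived");

        Console.WriteLine(
            $"Accuracy: {metrics.Accuracy:F4}, AUC: {metrics.AreaUnderRocCurve:F4}, " +
            $"F1: {metrics.F1Score:F4}");
    }
}
```

Swapping the final `Append` for `mlContext.BinaryClassification.Trainers.FastTree(...)` (from `Microsoft.ML.FastTree`) gives the second model, and the ensemble members above correspond to the `SdcaLogisticRegression`, `FastForest`, and `LbfgsLogisticRegression` trainers in the same catalog.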
Training on an Nvidia GPU was also tested, although it is not needed for this trivial amount of data.
## Example Output
Model Comparison Results:

| Model Type        | Accuracy | AUC    | F1 Score | Training Time |
|-------------------|----------|--------|----------|---------------|
| Standard LightGBM | 0.8799   | 0.9585 | 0.8366   | 0.20s         |
| FastTree          | 0.8878   | 0.9604 | 0.8471   | 0.21s         |
| Deep Learning     | 0.9910   | 0.9999 | 0.9883   | 0.88s         |
