Titanic Survival Demo

Blocks: 1 · Annotated: No · Author: Mage · Updated: 2/10/25

Was it the lifeboat, sheer luck, or a first-class ticket? This example pipeline explores the infamous Titanic disaster through data, extracting key features, training a predictive model, and deploying an online inference endpoint to estimate survival probabilities.

Pipeline Overview

Step 1: Load the Titanic Dataset – Import historical passenger data, including age, class, fare, and other features that influenced survival rates.
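A minimal sketch of what the loader block could look like, using Mage's standard data loader decorator; the CSV URL below is an illustrative public copy of the dataset, not necessarily the source the demo pipeline uses.

```python
import io

import pandas as pd
import requests

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_titanic_data(*args, **kwargs):
    # Public copy of the Titanic passenger manifest (illustrative URL);
    # swap in whatever source your pipeline actually reads from.
    url = 'https://raw.githubusercontent.com/datasciencedojo/datasets/master/titanic.csv'
    response = requests.get(url)
    response.raise_for_status()
    return pd.read_csv(io.StringIO(response.text))
```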

Step 2: Feature Engineering – Extract meaningful insights, such as family size, cabin location, and ticket class, to improve model accuracy. Because let’s be honest—first-class passengers had a much better shot.
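A transformer-block sketch of these features, assuming the standard Kaggle-style column names (SibSp, Parch, Cabin, Age, Sex):

```python
import pandas as pd

if 'transformer' not in globals():
    from mage_ai.data_preparation.decorators import transformer


@transformer
def engineer_features(df: pd.DataFrame, *args, **kwargs) -> pd.DataFrame:
    df = df.copy()
    # Family size: siblings/spouses + parents/children + the passenger themselves.
    df['FamilySize'] = df['SibSp'] + df['Parch'] + 1
    # First letter of the cabin as a rough proxy for location on the ship.
    df['Deck'] = df['Cabin'].str[0].fillna('U')
    # Simple imputations and encodings so the model block gets clean numeric input.
    df['Age'] = df['Age'].fillna(df['Age'].median())
    df['Sex'] = (df['Sex'] == 'female').astype(int)
    return df
```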

Step 3: Train the Survival Model – Use machine learning to identify survival patterns based on historical outcomes. The iceberg may have been unpredictable, but your model won’t be.
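One way to sketch the training block: a scikit-learn classifier fit on the engineered features and persisted to disk so a downstream block can serve it. The feature list, model choice, and file path here are assumptions, not the demo's exact configuration.

```python
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

if 'transformer' not in globals():
    from mage_ai.data_preparation.decorators import transformer

FEATURES = ['Pclass', 'Sex', 'Age', 'Fare', 'FamilySize']


@transformer
def train_survival_model(df: pd.DataFrame, *args, **kwargs) -> pd.DataFrame:
    X = df[FEATURES].fillna(0)
    y = df['Survived']
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print(f'Holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}')
    # Persist the model so the inference step can load it (path is illustrative).
    joblib.dump(model, 'titanic_model.joblib')
    return df
```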

Step 4: Deploy an Inference Endpoint – Set up a real-time prediction service where you can input passenger details and get instant survival odds. Think of it as a modern-day fortune teller — but powered by data, not mysticism.
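The prediction logic behind such an endpoint can be as small as the sketch below; how it is exposed (for example, via an API-triggered pipeline) depends on your deployment. The model path and feature order simply mirror the training sketch above and are assumptions.

```python
import joblib
import pandas as pd

FEATURES = ['Pclass', 'Sex', 'Age', 'Fare', 'FamilySize']


def predict_survival(passenger: dict) -> float:
    """Return the estimated probability that this passenger survives."""
    model = joblib.load('titanic_model.joblib')  # path matches the training sketch
    row = pd.DataFrame([passenger], columns=FEATURES)
    # Probability of the positive class (Survived = 1).
    return float(model.predict_proba(row)[0][1])


# Example payload: a 29-year-old woman travelling first class, alone.
print(predict_survival({'Pclass': 1, 'Sex': 1, 'Age': 29.0, 'Fare': 80.0, 'FamilySize': 1}))
```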

This pipeline serves as a hands-on introduction to data preprocessing, model training, and deployment within Mage. Test your predictions, tweak the model, and see if you would have made it aboard a lifeboat!

Useful guides

Develop dbt in Mage

dbt sources and upstream dependencies

dbt variable interpolation

Running a dbt model

Serving dbt docs in production

Running dbt snapshots